It is probably not wise to hide the business address; it might impact how the business ranks from a local point of view.
The only reason I can see for someone wanting to hide an address is if they wanted to shut down or move a location.
If you are using niche-specific directories, I think they will be fine.
The directories you need to stay away from are sites like freeSEOdirectory.info.
Do your research: see if the site is active, if it has quality content, and if it has an approval process for new business owners being added.
It does sound like the site was legacy and the update took too long to occur. We have seen the same thing with other business owners on legacy CMSs who have not moved over quickly enough, so it is a common issue.
I guess the things you need to think about are the following:
1. Were all the links disavowed pure spam? Most of the profile looks branded, and we have seen SEOs disavow quality links in the past, which can have an adverse impact on the domain (see attached). I presume the disavow was mostly legacy spam.
2. The domain may have been hit by Panda in the past (see attached), though without seeing internal data it is hard to give a more accurate analysis.
Hope this helps,
James
The thing with running an article-based site is that it is not wise to use articles which are already indexed on other sites; this will cause the pages not to be indexed on your site, and you will receive no organic traffic.
To prevent something like this, you would need to use the Copyscape API on your site and pre-check all content added to the site before it goes live - http://www.copyscape.com/api-guide.php
The API calls are not cheap, though this is a viable option for checking large amounts of content on a daily basis.
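As a rough illustration, here is a minimal Python sketch of that pre-publish check against the Copyscape Premium API. The endpoint and parameter names (u, k, o, e, t) are taken from the API guide linked above, but confirm them there; the username, key, and article text are placeholders.

```python
# Minimal sketch: pre-check an article against the Copyscape Premium API
# before publishing. Parameter names follow the API guide linked above;
# YOUR_USERNAME and YOUR_API_KEY are placeholders.
import requests
import xml.etree.ElementTree as ET

API_ENDPOINT = "https://www.copyscape.com/api/"

def is_duplicate(text, username="YOUR_USERNAME", api_key="YOUR_API_KEY"):
    """Return True if Copyscape finds the text already published elsewhere."""
    response = requests.post(API_ENDPOINT, data={
        "u": username,   # Copyscape account username
        "k": api_key,    # Premium API key
        "o": "csearch",  # text search operation
        "e": "UTF-8",    # encoding of the submitted text
        "t": text,       # the article body to check
    })
    response.raise_for_status()
    root = ET.fromstring(response.text)
    # Each <result> element is a page where matching text was found.
    return len(root.findall("result")) > 0

article = "Draft article text goes here..."
if is_duplicate(article):
    print("Hold the article: it already appears elsewhere in the index.")
else:
    print("No matches found; safe to publish.")
```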
I would noindex tag pages on your blog/website. The SEO benefit from these tag pages is limited, but the usability benefit is evident.
Using tag pages is still not a bad idea, as they can help with fine-tuning your category-level targeting.
Also, the attached image is a great example comparing the two, from this post - https://moz.com/blog/setup-wordpress-for-seo-success
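If you want to confirm the change took effect, a minimal sketch like the one below can audit which tag pages actually serve a noindex robots meta tag. The tag URLs are hypothetical; swap in your own list or a crawler export (requires the requests and beautifulsoup4 packages).

```python
# Minimal sketch: audit which tag pages actually carry a noindex directive.
import requests
from bs4 import BeautifulSoup

tag_urls = [
    "https://www.example.com/tag/widgets/",
    "https://www.example.com/tag/gadgets/",
]

for url in tag_urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    content = robots["content"].lower() if robots and robots.has_attr("content") else ""
    status = "noindex" if "noindex" in content else "INDEXABLE"
    print(f"{status:10} {url}")
```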
I guess you could push the "Located in Canada" message on your domain, and in the meta description for example, to increase CTR. If you want to increase authority and rankings, it is a question for the whole domain.
This is an issue you will see in all markets. Even in Australia, for example, the country-specific .com.au TLD will do well, yet US companies on .com will still rank for long-tail terms. You just need to push the "Buy local" message in your titles/descriptions where possible. My advice is to test it and see what happens.
You could have implemented the canonical tag on the site to stop any cross-site duplicate content issues.
The only time I would remove URLs is if you have a staging site with the same content on a different URL, for example.
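As a quick way to verify a canonical setup, a sketch along these lines pulls the declared canonical URL from each page. The example.com and example.net URLs are placeholders (requires requests and beautifulsoup4).

```python
# Minimal sketch: check which URL a page declares as canonical, useful when
# the same content lives on two sites.
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link["href"] if link and link.has_attr("href") else None

for url in ["https://www.example.com/page/", "https://www.example.net/page/"]:
    print(url, "->", get_canonical(url))
```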
Upload slideshows to Slideshare.net; Google loves the domain from what I have seen, and if you have a great range of links to the Slideshare content, you can have it ranking for great terms.
If you upload slideshow content to your own domain it may not be as effective.
1. MajesticSEO is another decent link tool; Ahrefs is very good as well.
2. I like Whitespark for citation finding.
3. Optimizely is good for CRO; it depends on your budget, as tools range from cheap to VERY expensive, some at 4,000+ a month.
4. Hootsuite is good for social management; I also like Topsy for social monitoring.
5. WordPress is good if you use the Genesis framework.
You can track beyond 50 with tools like AWR (Advanced Web Ranking); you may need a proxy to run it, depending on the number of keywords you want to track.
If you want estimated rankings for free, you can use Google Webmaster Tools ranking data or SEMrush. That being said, the data from SEMrush and from GWT will be limited and should only be taken as an estimate.
To be honest I tested the tool by Citation Labs and found it to be quite poor for broken link building.
One of the best guides is found here - http://www.quicksprout.com/the-advanced-guide-to-link-building-chapter-7/
Further to that, if you have a full licence for Screaming Frog, try using the external links tab to scrape whole websites and look for broken external links (a minimal version of this check is sketched below).
Overall, broken link building is a difficult area, yet if you make the right contacts it can be really great; we have scored some .gov and .edu links in the past.
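For anyone who wants to script the same check, here is a minimal sketch that pulls the external links from one page and flags any that return an error. The target URL is a placeholder, and a proper crawler (like Screaming Frog above) does this at scale (requires requests and beautifulsoup4).

```python
# Minimal sketch: find broken external links on a page for broken link building.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

page = "https://www.example.com/resources/"
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")
page_host = urlparse(page).netloc

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    host = urlparse(link).netloc
    if not host or host == page_host:
        continue  # only external targets are link building candidates
    try:
        status = requests.head(link, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(status, link)
```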
Also check how much traffic the tags are currently getting. One site I looked at in the past had around 16k unique visitors a month coming from some of its tag pages, so proceed with caution. I agree with the advice above as well.
Personally, I do not mind this website: http://www.theiconic.com.au - they are pretty switched on from an SEO and eCommerce point of view, including user testing etc. There are some things I would change, yet they are a good basis for what you are asking.
Most of the bigger sites like Amazon etc. are legacy, so things need to be changed over time. I would look at new eCommerce sites which have been built from the ground up with SEO in mind; usually this is not the case, and it involves fixing and adding things over time.
To be honest, if it is all on the same server with the same hosting information etc., it is not going to be of much benefit from a long-term point of view. I tested your site in one server tracking tool, and it shows all the sites on the same hosting IP as being 98% similar (Google would see the same data), which answers the reply above asking whether it is on the same C-class.
| <a id="mfa99" class="domain-name"></a>[+] <a id="mfa100" class="domain-name"></a>kidsrcrafty.com | 98% <a id="mfa101" class="expand-indicator expand"></a>reasons why | | > 1 million |
| <a id="mfa115" class="domain-name"></a>[+] <a id="mfa116" class="domain-name"></a>www.dltk-kids.com | 98% <a id="mfa117" class="expand-indicator expand"></a>reasons why | | > 1 million |
| <a id="mfa137" class="domain-name"></a>[+] <a id="mfa138" class="domain-name"></a>dltk-bible.com | 98% <a id="mfa139" class="expand-indicator expand"></a>reasons why | Jan. 26, 2002 | 96566 |
| <a id="mfa154" class="domain-name"></a>[+] <a id="mfa155" class="domain-name"></a>dltk-teach.com | 98% <a id="mfa156" class="expand-indicator expand"></a>reasons why | Nov. 2, 2002 | 115330 |
| <a id="mfa172" class="domain-name"></a>[+] <a id="mfa173" class="domain-name"></a>dltk-holidays.com | 98% <a id="mfa174" class="expand-indicator expand"></a>reasons why | Oct. 16, 2002 | 230480 |
| <a id="mfa188" class="domain-name"></a>[+] <a id="mfa189" class="domain-name"></a>coloring.ws | 98% <a id="mfa190" class="expand-indicator expand"></a>reasons why | | 43523 |
| <a id="mfa204" class="domain-name"></a>[+] <a id="mfa205" class="domain-name"></a>kidzone.ws | 98% <a id="mfa206" class="expand-indicator expand"></a>reasons why | | 55624 |
| <a id="mfa218" class="domain-name"></a>[+] <a id="mfa219" class="domain-name"></a>dltk-poems.com | 98% <a id="mfa220" class="expand-indicator expand"></a>reasons why | | > 1 million |
| <a id="mfa229" class="domain-name"></a>[+] <a id="mfa230" class="domain-name"></a>dltk-enfants.com | 54% <a id="mfa231" class="expand-indicator expand"></a>reasons why | | > 1 million |
| <a id="mfa235" class="domain-name"></a>[+] <a id="mfa236" class="domain-name"></a>makinglearningfun.com | 52% <a id="mfa237" class="expand-indicator expand"></a>reasons why | April 2, 2006 | 178965 |
| <a id="mfa241" class="domain-name"></a>[+] <a id="mfa242" class="domain-name"></a>dltk-ninos.com |
I think the main question with link removal is the following:
Are you trying to get out of a manual or an algorithmic penalty?
1. If the link is nofollow, do not worry about it (many of these junk sites are nofollow).
2. If the site is using a branded anchor text or a URL anchor text, I would not worry, to a degree.
3. The main links you need to look at are generic anchor text links. The thing with these dodgy submission sites is that if they have generic followed links to your site, then yes, I would try to delete them and disavow (a rough triage of these rules is sketched below).
It is really a case-by-case scenario. Another thing you can do is use the Majestic bulk backlink checker at the URL level for any link removal; you can check 150 URLs at a time, so it is good on the fly.
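As a rough sketch of that triage, something like the following can sort an exported backlink list by those rules. The column names (url, anchor, nofollow) and the brand terms are assumptions; adjust them to whatever your backlink export actually uses.

```python
# Minimal sketch: triage an exported backlink list by the rules above.
import csv

BRAND_TERMS = ["acme", "acme.com"]  # placeholder brand/URL anchor terms

def triage(row):
    anchor = row["anchor"].lower()
    if row["nofollow"].strip().lower() == "true":
        return "ignore (nofollow)"
    if any(term in anchor for term in BRAND_TERMS):
        return "low priority (branded/URL anchor)"
    return "review (generic followed anchor)"

with open("backlinks.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        print(triage(row), "-", row["url"])
```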
To be honest, Moz, and even Raven and the others to a certain degree, are only going to give you one level of keyword data. If you are really keen on keyword research and want to do it right, there are custom tools on the market for just that; an example is http://keywordsnatcher.com/
That being said, the Moz tool set has the keyword difficulty section http://moz.com/tools/keyword-difficulty which I guess is kind of what you need.
Really, the question is how deep you want to go with your keyword research.
I have used HostGator and GoDaddy in the past. To be honest, I found HostGator better for what I needed at the time.
Overall, these three solutions are entry-level hosting providers.
My advice is the following:
1. Check how much traffic is coming from this section; you can do this with landing page analysis in Google Analytics or whatever tracking you use (a quick way to total it from an export is sketched below).
If you are getting a decent amount of traffic from these articles, even if it is long tail, I would think of another strategy before slapping on a noindex, because when you do, that traffic will go.
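As a minimal sketch of that first check, something like this totals sessions for one section of the site from a landing page CSV export. The column names ("Landing Page", "Sessions") and the /articles/ path are assumptions; match them to your actual export.

```python
# Minimal sketch: total sessions for one section from a landing page export.
import csv

SECTION_PREFIX = "/articles/"
total = 0

with open("landing_pages.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["Landing Page"].startswith(SECTION_PREFIX):
            total += int(row["Sessions"].replace(",", ""))

print(f"Sessions landing on {SECTION_PREFIX}: {total}")
```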
I have dealt with a similar strategy for a news website in the past. What many of the big syndication players do is take duplicated content, rank it on Google News for 30-60 days, then 404 the page. I have seen this numerous times; I do not know how viable the strategy is overall.
I've also noticed some news websites play around with canonical tags via various partners on duplicated content, and yes, they also do some noindexing.
Really research this before you implement it. I have done a bit of news SEO for Australian sites; it is an interesting area with limited information online.
To be honest, I wouldn't use Crazy Domains; they are a real nightmare if you want to transfer domains, and I have also seen many companies let Crazy Domains inventory expire, as the auto-renew setup is not great.
As stated above, I would go with a company like Ventra IP or Netfleet (these guys really know domains).
It is not just search query volumes that local results are affecting.
I have a strong feeling they are playing on CTR across search results as well. I have seen some research on local keywords where position 10 for a geo term gets 20% CTR overall.
But yes, things are changing in the way local results display, and this change has been rolled out over the last few months, if not years. The thing is, Google also tests and changes results as well.
Well, the thing about the .biz site is that they are mainly targeting this keyword alone and are not worrying about any other keywords.
Google still gives some favor to EMDs (exact match domains) for their target keywords.
A few other tips:
1. Look at the backlinks they have built and try to replicate some of them for your keywords.
2. Try moving the AADL keyword to the front of your title.
3. Link internally to the AADL page within your website to make use of your internal link juice.
4. Build backlinks to your page, but mix up the anchors with brand/URL/generic etc.
5. Keep adding new content.
This way, you should rank for AADL.
To be honest, if I was going to roll out a theme used by 10,000 people, I would worry about the anchor text being the same on 10,000 sites. I know when Penguin 1.0 rolled out, many, many websites with footer links on a massive scale were hit VERY hard; there were even case studies from WordPress theme owners on Moz.
The key thing, if you are worried, is to make the footer link nofollow.
Where to distribute it depends on whether you want to push it for free or make money from it. ThemeForest is a good start, and there are hundreds of free theme sites in the market.
A few things to think about:
1. I usually crawl the old site with the Screaming Frog software to build my URL list.
2. I map out the URLs to related new pages on the site; anything unrelated just gets a 301 to the home page or a subcategory section (see the sketch after this list).
3. Track your top converting keywords from the old site before and after the move to note changes.
4. Another thing to look at is links to your site from external sites; if you have some high-authority links, it could be worth getting them changed to the new URLs.
5. Re-crawl the site after the launch to look for any error pages etc. I wouldn't personally use GWT for something like this; I would use DeepCrawl or Screaming Frog.
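Here is a minimal sketch of step 2: matching old URLs to new ones on their final path segment (slug), with anything unmatched falling back to the home page. The URL lists are placeholders; in practice they come from the crawls mentioned in steps 1 and 2.

```python
# Minimal sketch: build a 301 redirect map by matching old and new URLs on
# their slug, falling back to the home page for anything unmatched.
from urllib.parse import urlparse

old_urls = [
    "https://old.example.com/products/blue-widget",
    "https://old.example.com/about-the-team",
]
new_urls = [
    "https://www.example.com/shop/blue-widget",
    "https://www.example.com/company/about-the-team",
]

def slug(url):
    return urlparse(url).path.rstrip("/").rsplit("/", 1)[-1]

new_by_slug = {slug(u): u for u in new_urls}
fallback = "https://www.example.com/"

for old in old_urls:
    target = new_by_slug.get(slug(old), fallback)
    # Emit one redirect rule per old URL (htaccess/nginx syntax differs).
    print(f"301: {old} -> {target}")
```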
Another method I have been using for a while, just to check the number of indexed images:
1. Search for site:www.website.com
2. Scroll to the bottom of the page, and hit "switch to basic version"
3. It will then show the total number of indexed images via Google.
I am sure there are other ways you can do it via crawlers and analytics reports too.
Another thing you can do to test your rankings is to use the Google Ad Preview tool while not logged in, and compare the results with what comes up: https://adwords.google.com/d/AdPreview/?__u=1000000000&__c=1000000000
No ranking tool is 100% accurate at the moment, even AWR with a proxy setup.
A few options:
1. As David said, serve the page as a 410 (Gone) so it drops out of the index.
2. Try to remove the links at scale: review why they are coming in (i.e. same IP address, same WHOIS), request that sites remove them, and if they don't, add them to the disavow file.
I wouldn't 301 the pages; this will just transfer the problem to a new website. I've seen numerous cases where domains have been hit because of cross-site 301s.
Two options:
Option 1 - 301 redirect to a similar subcategory-level page for that product.
Option 2 - Set up a landing page which states that the product is no longer for sale and shows similar items that are; this is a common strategy used by group buying sites.
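As a minimal sketch of option 1, assuming a small Python/Flask app (any web framework or server config can do the same), discontinued product slugs are mapped to their subcategory page and served as permanent 301 redirects. The slugs and paths are hypothetical.

```python
# Minimal sketch: 301 discontinued products to their subcategory page.
from flask import Flask, redirect, abort

app = Flask(__name__)

# Hypothetical mapping of retired product slugs to live subcategory pages.
DISCONTINUED = {
    "blue-widget-2000": "/widgets/",
    "red-gadget-pro": "/gadgets/",
}

@app.route("/products/<slug>")
def product(slug):
    if slug in DISCONTINUED:
        return redirect(DISCONTINUED[slug], code=301)
    abort(404)  # real product handling would go here

if __name__ == "__main__":
    app.run()
```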
It does happen from time to time with GWT that you see a large loss in backlink data.
My advice is to track your backlink profile with OSE, Majestic and Ahrefs. It is best to monitor numerous sources rather than relying on just one, as GWT will only ever show a sample of your full link profile.
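A minimal sketch of that idea: merge the exports from the different tools and dedupe by linking URL, so one index's gaps do not hide links. The file names and the source_url column are assumptions; match them to your real exports.

```python
# Minimal sketch: merge backlink exports from several tools and dedupe.
import csv

EXPORTS = ["ose.csv", "majestic.csv", "ahrefs.csv"]
seen = {}

for path in EXPORTS:
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row["source_url"].strip().lower()
            seen.setdefault(url, path)  # remember which tool found it first

print(f"{len(seen)} unique linking URLs across {len(EXPORTS)} sources")
```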
It is worth making a Press or In the News page on the website. If the website is an authority, you can also put logos on the home page saying "as seen in"; it is more of a trust thing for buyers.
To be honest, it is not a wise thing, when you have a working website with good PR and links, to park it and take down all the content; the domain then drops out of the index.
My advice is to keep that site live until the new site is ready to launch; on that day, implement 301 mapping from the old site to the new site.
The thing with Google Australia is that it tends to favor the following things; I have worked in the Australian and US markets for a long time, so I see this day in, day out.
1. A website which is using a local domain, .com.au or .net.au - Australian-specific domain names (if you are running off a .com domain, make a subfolder or subdomain for the region).
2. A website which is hosted locally in Australia (not as important as the above).
3. Local pages and content for each specific region, for example Sydney, Melbourne and Brisbane based content-driven pages. (From recent testing, Google seems to be placing some focus on local content for each specific region.)
4. Local links and citations also help the overall process.
Open Site Explorer only updates its index every month; you can view the update schedule at the following URL: https://seomoz.zendesk.com/entries/345964-linkscape-update-schedule
When the index is updated, the link metrics should change.
Depends how "well known" the agency is, I have taken one 2 clients from a local agency which is quite large they have about 40 staff locally. We asked them to remove a large number of links from their link networks and directories they own. They actually removed most of the links in question over a few weeks periods.
The process we used was to have the client contact the agency direct with some wording such as "We realized you have been using link networks to do link building (see attached report of network links) this has resulted in a serious drop in revenue and traffic. We request that you remove the attached links in the report from your network."
This worked with two prior clients, they did clean up most of the damage, but that been said some other SEO companies would not do the same.
To be honest, I would focus my efforts on making one or two really good websites rather than 15 different ones.
Using exact match domains on a larger scale is a lot different in today's market compared to, say, 2 years ago, when you could make a 5-page site and get the EMD ranking.
If you do go with a 15-site strategy, you need to make 15 sites and 15 content pieces to make sure everything is unique, and split the hosting 15 ways; I would not even use any tracking that can link up the sites, IMO.
A few factors you need to look at:
1. On the page where the link sits, there are many outgoing signature links (spammy).
2. Whilst the domain is strong, the page where the link is going to be located is not the best.
Overall, I do not deem this to be a high-quality link; in the eyes of Penguin, forum signature spam is something they are looking to target.
There are plenty of legitimate ways to build links; check out:
http://searchenginewatch.com/article/2064922/131-Legitimate-Link-Building-Strategies
http://backlinks.com.au/15-ways-to-build-backlinks-post-google-penguin-update/
Hope this helps
I still see related searches showing, but I have noticed Google has been running numerous tests in this space recently.
That being said, if they take it away it is not the end of the world, as some of this data can still be obtained from keyword research and also auto-suggest.
But yes, there have been some "tests" by Google in this area, that is for sure.
Hi Mate,
I actually live in Australia. At the moment it is not hard to get work at an agency, but it all depends on what type of visa you are coming into the country on - Working Holiday, 457, etc.; these are questions I would start thinking about.
Last year I was in Europe for 3 months. I have a few consulting clients and numerous web businesses; I just told the clients that I would be going away for 3 months, and that if possible we could put things on hold, with some of my contractors working 3 days a month per client to keep the work ticking over.
I also checked in on emails and whatnot when I was on holiday and recovering from a big night out.
I can tell you this much about Australia:
1. Sydney - Heaps of SEO work.
2. Melbourne - a fair bit of SEO work.
3. Brisbane - limited SEO work.
In most other cities it is limited, as Australia's business is primarily done in these main cities.
DMOZ links and directory links in general will not hurt the website if they are decent quality.
Last October I interviewed an ex-member of Google's Search Quality team, who confirmed this and also confirmed that high-quality directories are still fine to use: http://jamesnorquay.com/an-interview-ex-member-matt-cuttss-search-quality-team/
But that being said, out of the 130 lower-quality sites, I would check them over first to see if they are indexed, whether they have PageRank on the page, and the general style of the directory. If you are concerned, I would look at manually removing them via outreach where possible; in the worst-case scenario, where they are very low quality, I would use the Disavow tool (a small helper for building the disavow file is sketched below).
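As a small helper for that worst-case scenario, a sketch like this turns a vetted list of low-quality directory domains into a file in the format the Disavow tool accepts (one "domain:" line per domain, "#" for comments). The input file name and its one-domain-per-line layout are assumptions.

```python
# Minimal sketch: build a disavow file from a vetted list of bad domains.
with open("bad_directories.txt", encoding="utf-8") as f:
    domains = sorted({line.strip().lower() for line in f if line.strip()})

with open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Low-quality directories, manual removal attempted first\n")
    for domain in domains:
        out.write(f"domain:{domain}\n")

print(f"Wrote {len(domains)} domains to disavow.txt")
```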
To be honest, if you have a high-quality site with great content and you then come in and fill it with ads, it may not be the best strategy.
In regards to link wheels, the strategy has developed over the years: people have the money site, build content hubs on say 40 different websites, and spam backlinks from all of them to the main site. It is probably not the best strategy, and Google has been aware of it for a long while.
Whether the site gets hit or not depends on how high quality it is; if the site is really high quality with a huge link profile, it may be able to bypass a Google penalty.
At the end of the day, Google can crack down at any time, no matter how large the site is; look at the BBC News story in the media today.
No problem mate, here is a reply we received:
_I am so sorry that you are running into some issues with our web tools. Rankings, Rank Tracker and our Keyword Difficulty tool all ran into these unexpected outage issues. Unfortunately, Google has changed the way that they present rankings information, so we have had to make some changes to our collection in order to keep up with these changes, I'm afraid that this has caused some issues with providing the real time data for our Rankings and the Keyword Difficulty tool is also currently unresponsive. Our engineers are working tirelessly through this holiday weekend to fix this for you. At this time, unfortunately, we do not have an ETA for when that tool will be up and running again. I thank you for your patience while we work to resolve this issue and I will update you when I have any further information for you. I am hoping for an ETA soon. _
Also, from what I have read, they are saying it should possibly be back up and running by late Friday; let's hope so.
The keyword tool has been down for a few days; I use it daily on my work account.
Thread about this problem here:
http://www.seomoz.org/q/seomoz-keyword-difficulty-tool-been-down-for-a-few-days
There is also a dev area for updates here:
https://seomoz.zendesk.com/entries/22457872-keyword-difficulty-and-rank-tracker-issues
Hope it is back up soon. I know the devs at SEOmoz are working hard to fix the issues, but I need to finish off some work.
To be honest, I would probably go with the option "MOSTLYMATCHINGKEYWORDcompany.com".
In my eyes, .com is still king in the domain world on a global level;
.com means commercial and is for business.
I have had some decent success with .net over the years, but at the end of the day, the best success I have seen has been with .com domains and .com.au (my market).
Most big businesses go with the .com domain too; that is why they are still so expensive.
My advice is to wait for the reply from Google after they look at the reconsideration request; it can take longer than 3 weeks, in my experience.
Furthermore, if they knock it back again, you may have to export all your links from Webmaster Tools and go over them one by one to see what is paid or unnatural; annoying, but these are the lengths Google is going to in today's market.
Sites can recover. I have seen a few which were hit by Penguin and, after 3 or 4 reconsideration requests, came back somewhat in the ranks; it just takes time.
From my experience, a title tag should be below 68 characters including spaces.
If you make a title tag 70-80 characters including spaces, it will be too long and will get cut off.
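A minimal sketch to flag titles over that threshold across a list of pages; the URLs are placeholders, and a crawler export works the same way (requires requests and beautifulsoup4).

```python
# Minimal sketch: flag title tags over the ~68 character threshold above.
import requests
from bs4 import BeautifulSoup

MAX_LENGTH = 68

for url in ["https://www.example.com/", "https://www.example.com/about/"]:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    flag = "TOO LONG" if len(title) > MAX_LENGTH else "ok"
    print(f"{flag:8} {len(title):3d}  {url}")
```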
Google is pretty good at picking up a network of sites on the same C-class server range.
That being said, think of the following methodology: if you acquire 4 links on newspaper websites on the same server, and all those links are very high authority, then they will provide 4 different links, all with good value.
So, that being said, if you can acquire links on 4 different high-quality sites on the same server, it is worth it for sure.
But be wary: if someone comes to you saying "I will sell you 50 site links on 1 server", all on low-authority sites, I would steer clear, as it is a link network.
Well, the first question is:
Are they looking to buy 36 different domains and run geo-specific sites in 36 countries, or are they looking to use one main site and build geo layers off it?
I have developed numerous strategies in the past around 20 geo levels, but at the end of the day you really need to answer the above question first, as there are really two ways this project can go based on the client's request.
I saw a thread on Quora about this. The interesting thing was that an ex-Facebook SEO actually came in and posted some strategies around methodologies which Facebook could potentially use to take their SEO to the next level, but which were deemed too aggressive.
If I worked in their department, I would be thinking on big levels, as in the post above; you can't really think on the level of 10 meta tags, you need to think of big-site SEO, dealing with millions of pages, and the tactics that can be designed from such strategies.
If you have an enterprise site with many thousands of pages, many problems, and many, many natural links, and you only do on-page work, you can see some good results.
But if you have a small business site with no natural external links and you do only on-site work, I doubt you will see much in terms of results. That being said, you may push out hundreds of fantastic articles, yet will consumers actually see those articles?
The thing is, SEO software is not going to do SEO for you; it is only going to assist you with processes and make things quicker.
To me, SEOmoz is the best at the moment for what they offer, i.e. OSE, rank tracking, error tracking, Followerwonk, link finding, etc.
Raven is good too, but SEOmoz is better than Raven in my eyes.
You then have hundreds of other tools: for enterprise level, for small business, for link building.
The best way to build good-quality links is to make good-quality content and then seed that content out to the wider audience.
If you want other methods, here are 5:
1. Industry-specific directories.
2. Guest blog posts on related websites.
3. Social media posts on higher-quality sites.
4. Creating a high-quality forum profile, being a member of the site, then posting about your site.
5. Making an infographic for your industry and then posting it on social sites.