Posts made by CatalystSEM
-
RE: Unable to download OSE Backlink attachment. Showing error
Also having this problem. Emailed help@moz.com.
-
RE: Geolocate or not?
Hello Bilal,
Instead of geo-locating through Google Webmaster Tools, use "hreflang" to target your website to visitors in a certain region.
The rel="alternate" hreflang="x" annotations will help Google serve the correct language or regional URL to searchers. More information is provided here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077 and here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=182192&topic=2370587&ctx=topic.
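To make that concrete, here's a minimal sketch (the URLs are hypothetical) of generating the annotations for each regional version of a page:

    # Every regional version of the page lists every version,
    # including itself (URLs are hypothetical placeholders).
    regional_urls = {
        "en-us": "https://www.example.com/us/",
        "en-gb": "https://www.example.com/uk/",
        "x-default": "https://www.example.com/",
    }

    for hreflang, url in regional_urls.items():
        print('<link rel="alternate" hreflang="%s" href="%s" />' % (hreflang, url))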
Please let me know if this helps.
-
RE: Is there any SEO value to Infographs?
Simply creating the infographics won't create value on its own. You have to go out and promote them to build social signals and links, e.g. tweets, shares, comments, and backlinks (when people embed or share them).
Does this help?
-
RE: Where to look...
In-house is cheaper for you in the long run, but an agency lets you draw upon another company's expertise and will usually cost a lot more.
Go online to some of the best SEO companies in your area and check out their career sections. You'll know how to write your job description based on theirs.
Salary information is relatively easy to find. You can ask around or check out the SEOmoz salary guide for SEOs and SEMs.
-
RE: AJAX and Bing Indexation
I recommend doing as the Bing engineers say. Since you have the same content in both AJAX and non-AJAX versions, it is in your best interest to serve the content in a way that benefits both search engine crawlers and users.
The best way to do so is by sending search engines to the non-AJAX / static version and sending users to the AJAX version. I'm a little surprised that only Bing has a problem while Google does not, because Google usually requires the AJAX Crawling Protocol in order to index AJAX content.
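As a rough illustration of that split, here's a minimal sketch assuming a Flask app and a hypothetical render_snapshot helper; a crawler following the AJAX Crawling Protocol requests the _escaped_fragment_ version and gets static HTML, while visitors get the AJAX shell:

    from flask import Flask, request

    app = Flask(__name__)

    def render_snapshot(fragment):
        # Hypothetical helper: return pre-rendered static HTML for the
        # state encoded in the fragment (e.g. "page=2").
        return "<html><body>Static snapshot for %s</body></html>" % fragment

    @app.route("/products")
    def products():
        fragment = request.args.get("_escaped_fragment_")
        if fragment is not None:
            # The crawler translated example.com/products#!page=2 into
            # example.com/products?_escaped_fragment_=page=2
            return render_snapshot(fragment)
        # Regular visitors get the AJAX version.
        return "<html><body><script src='/app.js'></script></body></html>"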
Please let me know if this helps. I used an identical solution on one of my accounts and it resolved the issue.
-
RE: Penalised due to links?
The easy way to find out is by submitting a reconsideration request. The more difficult way is to audit your backlink profile and see if you built any shady links that violate Google's guidelines. The EMD update might also have caused your traffic to drop, since I notice that you have an exact-match domain.
Look into your analytics and see if you can identify the specific keywords that dropped in traffic.
-
RE: Getting 404 error when open the cache link of my site
Give it some time; it takes a while for Google to cache your page after crawling it. You should see it within a few weeks.
-
RE: Considering which agency to choose for a link building campaign is starting to seem like beating a dead horse.......
Yes, keep shopping. You can get industry legends for much less than that. I recommend asking for a proposal and a time frame from as many vendors as possible.
Let them know upfront what your budget is and that you are not going a dime over. After you get 10 - 50 proposals, start picking out which ones are best for you based on vendor fit. Don't go with people who are pushy or appear shady - you will regret it later.
Work with people with whom you have great chemistry.
-
RE: Blog URL
There are plugins to get around this issue, such as VaultPress. No installation is 100% hack-proof - I'd ask your designers for instances where website.com/blog got hacked. If your website is very popular and you are concerned about hacking, you should go with a more secure CMS like Drupal or an enterprise content management system.
Let me know if this helps.
-
RE: Friendly URLs
We prefer to have it in the first link. You want to avoid redirects as much as possible. I think it was MC who mentioned that you permanently lose about 1% of your traffic every time there's a redirect, and that applies to crawlers as well as users.
-
RE: How can i locate the links on my site that are causing 404 errors?
Good to see you here - we met in person at Lauren's bday several years ago in SF. Do you work at SEOmoz full-time now?
-
RE: How to search for authorative Links?
If you can tell me your niche or post your website URL, I can construct some linkbuilding footprints for you.
-
RE: What do you use for site audit
I use the following tools:
- Xenu - identifies broken links
- GSite Enterprise Crawler - identifies on page issues
- Google Cache, Google Webmaster Tools - finds crawling issues
- Scritch - finds server/platform type
- Ahrefs, Majestic, OSE - for link diagnostics
- SEO Book Bulk Server Header Tool - checks server response headers in bulk
-
RE: Link From Wikipedia Worthwhile?
No problem JP. Glad we could help out. Feel free to reach out to us if you have specific questions on link building. We've trained many folks and are happy to help out.
-
RE: Are you an in-house SEO or an Agency/freelancer SEO ?
Agency SEO but have worn all hats.
-
RE: How can i locate the links on my site that are causing 404 errors?
Use Xenu or Screaming Frog to find the pages that have broken links on your website. Just download and run it; at the end, it'll show you where they are located.
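If you'd rather script it, here's a minimal sketch of what those tools do, assuming the requests and beautifulsoup4 packages and a hypothetical start URL:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def find_broken_links(page_url):
        html = requests.get(page_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(page_url, a["href"])
            try:
                status = requests.head(link, allow_redirects=True, timeout=10).status_code
            except requests.RequestException:
                status = None
            if status == 404:
                # Report the broken link and the page it lives on.
                print("%s -> 404 (found on %s)" % (link, page_url))

    find_broken_links("https://www.example.com/")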
-
RE: Switching from a .org to .io (301 domain redirect)
I haven't seen a .io in the English SERPs either. Why not register a .io and see if you can get it to rank? The test should take less than a month, and if it ranks, then you can consider redirecting your .org to the new .io domain.
-
RE: Switching from a .org to .io (301 domain redirect)
If you focus on brand building, then yes, you'll offset any losses. The EMD update will pick out poor-quality exact-match domains that lack the requisite quality content.
Also, the loss from the 301 is not a short-term loss; it's a permanent loss.
-
RE: Switching from a .org to .io (301 domain redirect)
I would recommend leaving it as it is. Despite the general consensus that 301 redirects transfer all link equity and traffic, the truth of the matter is that you will experience a small permanent loss in traffic every time a 301 redirect is used.
I simply would not take that risk, as there are many things that can go wrong.
-
RE: Clients Slow to Publish Content
I prefer to stay away from ranks unless it's a document that says we did "x" and as a result our ranks improved by "z positions," "traffic grew by o%," and "sales or leads increased by n%."
Think of it as a 1, 2, 3 or 4 process flow chart with the money on the end result.
I include annotations on algorithmic updates, new/updated content uploads and outbound marketing activities. Basically anything that can create a social or normal link to your website. If the activity creates links then it should be annotated on the graph. This includes web development changes to your website.
It's difficult to communicate results when you also try to show what your plan was or is. I stay away from that because the purpose of the graph is to communicate results.
The line graph should be simple, visual and full of impact.
See if you can get one or two high-quality articles out the gate. If you can measure the process like I mentioned above, then show the resulting positional movement on the SERP and the conversions. See if you can add dollar figures to your numbers. Try to educate the client and show that 'x articles' can result in 'y leads' or 'z dollars.' Then put together a document that says if we produce this many articles, this is how much we can get in sales.
Make the business case. It's always a constant battle. Sometimes I like to do the reverse, especially when the client is so slow moving that I'm either losing faith in myself or getting downright frustrated. For these types of clients, I put together a business case that shows how much money they have been losing since I started working for them because they haven't moved fast enough.
Simply calculate the amount in sales that your articles produce in a single month. Multiply it by 12, or by however many months you've been employed by them, and say something like: "You've been leaving xxxxxxx dollars on the table every year because of x, y and z."
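As a minimal sketch of that arithmetic (all figures are hypothetical placeholders):

    # All figures below are hypothetical placeholders.
    articles_per_month = 4        # articles the client could be publishing
    sales_per_article = 1500.0    # dollars in sales a single article produces
    months_stalled = 12           # how long publishing has been delayed

    opportunity_cost = articles_per_month * sales_per_article * months_stalled
    print(f"Left on the table: ${opportunity_cost:,.2f}")  # $72,000.00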
Try this tactic only if your numbers and calculations are solid and substantial, and you will usually get a good reaction out of them. I hope this helps.
-
RE: Link From Wikipedia Worthwhile?
There's a trick to getting your Wikipedia articles or contributions posted. Wikipedia is like a credit card, if you have good credit then you can do whatever you want but if you have bad or no credit then you can't do what you want.
The other thing is that Wikipedia functions like a virtual world with actual territories over certain pieces of content. These territories, or content niches, are controlled by an individual or two, so when a newcomer starts making content contributions, it's like invading someone's territory and declaring war.
If you want to get your contributions accepted on Wikipedia, you have to start small in that niche. Start by making simple edits to build up your influence, perhaps 1 - 3 per day, pruning dead links or fixing link rot.
Then, in a couple of weeks, move on to minor article contributions. For example, if you work for an automotive brand, you could update the stats on a new model, or you could add a mention of a new plant species.
By now, your user permissions will have been upgraded twice, from an unregistered user to a new user and then an auto-confirmed user. Try to get more permissions added to your account; perhaps start reviewing bad edits and flagging them. Try to show stewardship over the content niche you'd like to own for a period of a few months to several years.
After six months to one year, submit your contribution. It is unlikely to be denied at that point, because by then you'll be considered an active member of the community who has contributed regularly by doing their part (click on any user in Wikipedia and you'll see their history of edits, awards, etc.).
-
RE: Clients Slow to Publish Content
I use a twist on George's technique. I use a project management tool where I assign the document to the initial person and, upon approval, ask them to assign it to the next individual in the chain of command. By creating such a system, you can focus on writing many documents and have them reassigned back to you as they get completed.
The other option is to spend time educating your client. We have a document here that I can't share. It's basically a timeline with algorithmic changes and content calendar uploads annotated onto an organic traffic graph, hosted in a collaborative environment like Google Docs.
Every time we have a win (a traffic uptick), I share it with all involved parties. If traffic declines, I report on it during the client presentation. Overall, I make sure that all stakeholders have access to that document and ask them to bookmark it, and I reference it each time people question me, to the point that individuals will go to the document first when they have a question.
As you do this, you will have people coming to you and asking all the whys and hows of the organic traffic flow. If you patiently answer their questions and evangelize SEO, then over time you will create a culture where people start asking why the traffic isn't moving up and how it can grow more quickly, and this conversation will eventually flow back to senior management.
If you are consistent in your efforts, you will eventually gain access to an executive-level champion like a CMO or a VP of Marketing. When you gain access to this individual, it's very important to ask how you can expedite the process mentioned above. If you can provide a strong business case, that executive-level buy-in will trump all your problems and things will become tremendously easier for you.
-
RE: Has anyone used or have the lowdown on LeadLander?
I've used LeadLander before. It's a legitimate business but nothing that you can't replicate if you know how. It basically uses a user's geographic information to pinpoint the user on a map then it matches that data against a list of businesses in that area.
When it finds a match, it hooks up into a database of contacts like Hoover's or a proprietary source to give you a list of contacts that may have browsed your website. It's up to you to cold call the business to find your decision maker.
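As a rough sketch of that mechanism (geolocate_ip and the business/contact lists below are hypothetical stand-ins for the GeoIP and Hoover's-style databases a real product would use):

    def geolocate_ip(ip):
        # Hypothetical: resolve an IP to an approximate (city, region);
        # a real product would use a GeoIP database here.
        return ("Seattle", "WA")

    # Hypothetical stand-ins for a Hoover's-style business database.
    businesses = [{"name": "Acme Cogs", "city": "Seattle", "region": "WA"}]
    contacts = {"Acme Cogs": ["jane.doe@acmecogs.example"]}

    def identify_visitor(ip):
        city, region = geolocate_ip(ip)
        # Match the visitor's location against known businesses nearby,
        # then pull possible contacts to cold call.
        matches = [b for b in businesses
                   if b["city"] == city and b["region"] == region]
        return [(b["name"], contacts.get(b["name"], [])) for b in matches]

    print(identify_visitor("203.0.113.7"))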
It works poorly for medium-sized and enterprise businesses because there are hundreds or thousands of contacts at those companies. However, if your demographic is a mom-and-pop store or a self-employed person, then it makes better sense to use it to generate leads.
-
RE: Link From Wikipedia Worthwhile?
It's worth it because if you manage to write a page of copy on Wikipedia with citations pointing back to your website, people who research the Wikipedia article may jump to your website via the citation and reference your article instead.
Basically write a fairly comprehensive article but leave out some key elements, all of which are found on the secondary information source that is owned by you.
The key is to think long term. Wikipedia typically ranks for competitive queries so you may not get your links in the beginning but you have a chance to build them over time.
One of my mentors taught me that it's not so much about building a single link as about how you can build a system that will build links for you while you sleep. If you seed enough of these systems around the internet, one or two links here and there won't matter much, but over the course of your career you will find that you've racked up a lot of links and continue to do so.
-
RE: Internal Site Search Analysis
I recommend reading through the entire spreadsheet from start to finish and color coding every distinct intent keyphrase that you come across. Any keyphrases with identical or very similar intent should be color coded together.
I do this process manually then I write down every intent variation that I came across. It usually ranges from 10 - 30 different intent buckets. Once I have the pieces of the puzzle, I start piecing together the search funnel for the business.
Who are your known demographics (the personas that your marketing team talks about or your target audience)? Are there intent buckets that do not map to these personas? If yes, then you may be attracting a demographic that is unaccounted for. See if you can figure out who this demographic is.
Is it a worthwhile demographic? Pull out your conversion data. See if you can use the master list of intent buckets to break down the search funnel into a flow chart for each demographic by using the keyphrases mentioned in the first paragraph of this response.
I'm assuming that you have some kind of an employment website like Indeed or Monster.
There are different kinds of personas that I can think of off the top of my head. For example:
1. Employed
2. Unemployed
3. College Student
4. Remote Worker
On 4 different documents, you'd write down the keyphrase intent buckets that you know for a fact are associated with each persona mentioned above. You'll want to use conversion data, or pages on your website that are built for specific target audience groups, to help you make this decision. For example, if you had a page targeted at a college student, then you could analyze the keyphrases that drive traffic to that page to determine whether your intent bucket is grouped correctly with the right demographic. You could go one step further by looking at the keyphrases that convert into an email subscriber who you know is a college student with 100% certainty.
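As a minimal sketch of the bucketing step (the keyphrases and buckets are hypothetical; in practice the color-coded spreadsheet from your manual pass supplies this mapping):

    from collections import Counter

    # Hypothetical mapping from keyphrase to intent bucket; in practice
    # this comes from the color-coded manual pass described above.
    intent_buckets = {
        "entry level jobs": "college_student",
        "work from home jobs": "remote_worker",
        "career change advice": "employed",
        "unemployment benefits": "unemployed",
    }

    site_search_log = ["entry level jobs", "work from home jobs",
                       "entry level jobs", "resume tips"]

    counts = Counter(intent_buckets.get(phrase, "unmapped")
                     for phrase in site_search_log)
    print(counts)  # Counter({'college_student': 2, 'remote_worker': 1, 'unmapped': 1})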
Once you have a set of intent keyphrases mapped to each demographic, try to break down the steps. Put yourself in the shoes of a single demographic and make a decision tree. Write down all the steps that person would take online, in sequence, and see if you can map your keyphrases to that list.
There's more but this exercise will get you started and help you figure out the different target audiences that come to your website and highlight their search behavior from keyphrase to keyphrase. It will show you the best points to attack them from an online marketing perspective by tailoring your messaging to distinct steps in the funnel.
Please let me know if this was helpful.
-
RE: Googlebot on paywall made with cookies and local storage
To make sure I'm understanding your question correctly: you want Google to crawl and index all of your content, while visitors get a metered paywall that allows 5 free articles before locking down.
Yes, it would technically be treated as cloaking, but you have a legitimate reason for doing so, and intent matters a great deal. You could check for a search engine user-agent string such as "Googlebot" and then serve the full content. This would ensure that your content is still crawled and indexed.
The only downside is that any tech-savvy individual can spoof the check by setting their user-agent to "Googlebot" and bypass your paywall.
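Here's a minimal sketch of that user-agent check, assuming Flask and hypothetical full_article/metered_teaser helpers; note that a production setup would also verify the crawler's IP (e.g. via reverse DNS) to mitigate the spoofing issue above:

    from flask import Flask, request

    app = Flask(__name__)

    def full_article(slug):
        # Hypothetical helper: the complete article text.
        return "<html><body>Full text of %s...</body></html>" % slug

    def metered_teaser(slug):
        # Hypothetical helper: teaser plus the 5-free-articles paywall logic.
        return "<html><body>Teaser for %s + paywall</body></html>" % slug

    @app.route("/article/<slug>")
    def article(slug):
        ua = request.headers.get("User-Agent", "")
        if "Googlebot" in ua:
            # Crawlers get the complete content so it can be indexed.
            # Anyone can spoof this header, so verify the IP in production.
            return full_article(slug)
        # Visitors get the metered paywall experience.
        return metered_teaser(slug)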
-
Avoiding duplicate content with national e-commerce products and localized vendors
Hello 'mozzers!
For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes -- no long tail keywords here!
CogExample.com sells over 35,000 different varieties of cogs online, broken into search engine friendly categories and using both HTML and Meta pagination techniques to ensure adequate deep-linking and indexing of their individual product pages.
With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to re-sell their cogs.
CogExample.com's primary objective is B2C online sales for their highly-sought search terms, e.g. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, e.g. "seattle green cogs".
The geo/location-based retailer's web-content will be delivered from the same database as the primary online store, and thus is very likely to cause duplicate content issues.
Questions
1. If the canonical meta tag is used to point each geo-based product page to the primary online product page, the geo-based page will likely be placed in the supplemental index. Is this correct?
2. Given the massive product database (35,000 products) and retailer count (2,500), it is not feasible to re-write 87,500,000 pages of content to satisfy unique-content needs. Is there any way to prevent a duplicate content penalty?
3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP rankings?
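To make question 1 concrete, here's a minimal sketch (URLs hypothetical) of the annotation each geo page would carry:

    # Hypothetical URLs: each localized product page points its
    # canonical at the national product page.
    national_page = "https://www.cogexample.com/green-cogs"
    geo_page = "https://www.cogexample.com/seattle/green-cogs"

    # This tag would be emitted in the <head> of geo_page:
    print('<link rel="canonical" href="%s">' % national_page)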
-
RE: Tips for Link Building for Mobile Sites
Hi Mark,
What would the purpose of those links be? As far as I know, mobile results are not that different from desktop results, except for local results when a user allows geo-location on their device. In the near future Google might roll out a new form of SERPs specifically for mobile users, but in the past, mobile results have not been influenced by inbound links.
The only recommendation at this time is to get those links from mobile website directories and similar sources.