Questions created by spyke01
Dropping Lower and Lower
So, a little history here: my site is 10 years old, and each year I do a theme change to keep things fresh. Over time the site has gone from static HTML/PHP to WordPress optimized with Yoast SEO. I mainly target the College Station, Texas area, but I have had clients find me from California to Australia. My site has a DA of 26 and a PA of 36 for the homepage, and the internal pages range pretty widely. I've tried to go after local keywords, since I know I'm still optimizing for the base keywords even if I can't rank high for them.

I have trimmed down some content over the years and used 301 redirects where needed. I try to keep my landing pages above 800 words, and for blog posts I aim for 2,000+ words. I have my NAP with microdata on the site, and this matches my info on Google Places/Google+, Facebook, and several other sites, including the Moz Local sites. I do not list a physical address but have my business listed as a service that goes to the client. I mainly offer web design services, so my keywords are phrases such as "web design college station tx".

For a long time my site typically held the #1 to #3 spot. This was before personalized results, and when those rolled out I was still usually on the first page when checking in a private window or with a remote rank-checking site. My traffic has been pretty bad: in the early days it was typically around 75-100 visits a day, but now I'm lucky to get between 7-12 visits a day, and most of those tend to be for software I sell and not for my design services.

I have several social signals on my pages, but I don't have a consistent social media or blog posting schedule. I've never been the victim of a manual penalty, and I keep tabs on my backlinks and my competitors, but I'm at a loss here as to how to beat out my competition. I'm considering creating pages for each local market and doing original content for each, but I've never seen much result from this in the past. I'm also considering doing Facebook ads as well. Any ideas on this?
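For reference, NAP details marked up with schema.org microdata (as mentioned above) usually look something like the fragment below. The business name, phone number, and locality here are placeholders, not the poster's actual details; a service-area business that doesn't list a physical address can simply omit the street-address property, as shown:

```
<!-- Hypothetical NAP markup using schema.org LocalBusiness microdata -->
<div itemscope itemtype="https://schema.org/LocalBusiness">
  <span itemprop="name">Example Web Design</span>
  <span itemprop="telephone">(555) 555-0100</span>
  <div itemprop="address" itemscope itemtype="https://schema.org/PostalAddress">
    <!-- streetAddress omitted for a service-area business -->
    <span itemprop="addressLocality">College Station</span>,
    <span itemprop="addressRegion">TX</span>
  </div>
</div>
```

Whatever format is used, the name, phone, and locality strings should match the Google Places/Google+, Facebook, and Moz Local listings character for character.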
Local Website Optimization | spyke01
Best Way to Filter Backlinks
When analyzing backlinks and trying to get the same ones for another site, there are a ton of backlinks to go through. I know that if the DA of the linking site is high, then pages on that site might be a good choice, like contributing an article or something of the sort; but as far as the same page goes, you can typically only do this with a comment on the page. My question is: given a huge list of backlinks from multiple sites, is there an easy way to analyze the links and determine which ones I can copy without manually checking hundreds of links?
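A first pass over a big backlink export can be scripted rather than checked by hand. Here is a minimal sketch in Python, assuming a CSV export with `url` and `domain_authority` columns (hypothetical column names; adjust them to whatever your backlink tool actually exports). It drops low-DA links and keeps one representative link per domain, so only the shortlist needs manual review:

```python
import csv
import io
from urllib.parse import urlparse

def filter_backlinks(csv_text, min_da=30):
    """Keep high-DA backlinks and collapse duplicates per linking domain.

    Assumes columns named 'url' and 'domain_authority' (adjust to your
    export). Returns (url, da) pairs, highest authority first.
    """
    best_per_domain = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        da = float(row["domain_authority"])
        if da < min_da:
            continue  # skip low-authority links entirely
        domain = urlparse(row["url"]).netloc
        # Keep only the strongest link seen for each domain
        if domain not in best_per_domain or da > best_per_domain[domain][1]:
            best_per_domain[domain] = (row["url"], da)
    return sorted(best_per_domain.values(), key=lambda pair: -pair[1])

sample = """url,domain_authority
https://blog.example.com/post,45
https://blog.example.com/other,50
https://lowda.example.net/page,10
"""
print(filter_backlinks(sample))
```

This only narrows the list by authority; judging whether a given link is actually replicable (guest article vs. comment vs. directory) still needs a human look at each surviving URL.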
Link Building | spyke01
Workflow Question
It's been a long time since I've used Moz and things have changed considerably since then, so I want to make sure I'm on the right track here. I typically only work on the on-page SEO and then compare backlinks vs. competitors. I typically have someone else handle new backlinks, as they seem to be almost impossible to get these days.

Typically I go with this workflow (simplified version):

1. Research - Keyword Planner, keywordtool, Ubersuggest, etc., then trim the list and use Rank Tracker to look at the actual keyword difficulty of the top 10 listings, or a manual check if it's a small list.
2. On-Site Analysis - Use Website Auditor to get a full view of on-site issues, then solve them starting with broken links, then HTML, then meta, etc.
3. On-Page Analysis - Look at the Page Audit from Website Auditor to see areas where a client is failing to optimize the page for a specific keyword, then fix it.
4. Gather Backlinks - Use SEO SpyGlass to gather the links of competitors, analyze them, then check each one manually to determine whether I can get the same link if it's a high-quality one.

With Moz, would this look something like the following?

1. Research - Keyword Planner, keywordtool, Ubersuggest, etc., then trim the list and use Keyword Difficulty to look at the actual keyword difficulty of the top 10 listings, or a manual check if it's a small list.
2. On-Site Analysis - Use Moz Analytics campaigns to get a full view of on-site issues, then solve them starting with broken links, then HTML, then meta, etc.
3. On-Page Analysis - Look at On-Page Grader to see areas where a client is failing to optimize the page for a specific keyword, then fix it.
4. Gather Backlinks - Use Open Site Explorer to gather the links of competitors, analyze them, then check each one manually to determine whether I can get the same link if it's a high-quality one.

If so, then on #1 should I be doing the basic or a full each time? Is there something I am missing on this list?
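The "broken links first" step in both workflows can be partly scripted. A minimal sketch in Python using only the standard library: it extracts every link from a page's HTML, producing a list you could then feed to whatever status-code checker your audit tool provides (the actual HTTP checking is left out here, since tooling varies):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href value from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor hrefs in document order."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<a href="/about">About</a> <a href="https://example.com">Ext</a>'
print(extract_links(page))  # → ['/about', 'https://example.com']
```

Each extracted URL would then be requested and any 4xx/5xx responses flagged for the "fix broken links" pass before moving on to the HTML and meta issues.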
Getting Started | spyke01