General SEO Help
-
Hi Everyone,
**Website:** www.helppestcontrol.com
I've been working on a WordPress-based website for the past few months. It's a new site that we designed for an existing company that decided to rebrand.
The previous website had little to no traffic, so we've basically started from scratch. I've followed SEO guides and have completed many of the basics. We started using Moz just under a month ago and have made a ton of changes based on its suggestions. With all of this said, we have seen slight improvements in traffic, but nothing truly noticeable. In fact, 90% of our traffic is coming from a Facebook PPC campaign.
I think the main struggle is that the company has such a wide operating base (a ton of very small towns and cities). We created an optimized page for each one (same content, just switched out the keywords) in hopes of driving traffic. Is this the correct approach? Or should we optimize for general terms such as "Bed Bug Removal" rather than "Bed Bug Removal Barrie"?
I was hoping that the community could take a look at the website (maybe run it through a few tests) and give me some more suggestions. I would really appreciate any feedback.
Thank you!
-
I have always included targeted keywords such as 'bed bugs barrie' and it has worked well.
-
Thank you for the suggestions. We're definitely going to make some changes.
One last question: when we write targeted landing pages with original content (for example, a page about the city of Barrie), should that landing page be optimized for generic keywords such as 'bed bugs' and 'spider spraying', or should we still include targeted keywords such as 'bed bugs barrie' and 'spider spraying barrie'?
Thank you again!
-
I also agree with Dave. My company works with dozens of local businesses, and creating targeted landing pages is part of the strategy for most of them. The difference between original content and keyword-swapped content is often the difference between first-page rankings and third-page-or-worse rankings. Take the time to write in-depth content for each of these pages. You will want it long enough to rank for your main terms and any variations that may exist.
-
Hi Tim,
I'd suggest more variation in the content of your location pages than just swapping out keywords. And definitely flesh out the main pages and posts with more content, as they're a little on the short side.
Cheers,
Dave
-
Hi
Welcome to the community,
First of all, it looks like you have very few links pointing to your website, which isn't going to help rankings. I would look at doing some outreach, especially around some of your blog articles, as that information could be genuinely useful to other sites.
Look at other local sites whose users would find your content useful, contact the site owners, and ask whether they would like to use part or all of an article. If you are going to let them republish identical content, watch this Whiteboard Friday on the best practices to follow: http://moz.com/blog/syndicating-content-whiteboard-friday
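If it helps, the usual safeguard when another site republishes your article in full is a cross-domain rel=canonical on their copy pointing back to your original (or, failing that, a noindex on the copy). A rough sketch with a placeholder article path, just to illustrate:

```html
<!-- In the <head> of the partner site's republished copy of the article.
     The article path here is a placeholder for illustration only. -->
<link rel="canonical" href="https://www.helppestcontrol.com/blog/spotting-bed-bugs/" />

<!-- Alternative if they can't add a canonical: keep their copy out of the index entirely. -->
<meta name="robots" content="noindex, follow" />
```

Either way, it's worth asking for a visible link back to the original article as part of the arrangement.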
The next area I would look at is the length of your product and service pages. Could you increase the word count on them, maybe by adding some tips on what to look out for if you suspect you have bugs, etc.? There was a recent Whiteboard Friday which suggested you need around 1,500 words per page.
I am sorry to say it, but your 'where we serve' pages look too spammy and look like they were written purely for SEO purposes, and if I am thinking that, then so will Google. All the content is identical on each page, apart from the town/place name being swapped out.
Could you not simply have one page explaining all the services you offer along with a list of the towns? (You will have to excuse me here, as I don't know US places that well.) Or could you group the towns into larger areas, reduce the number of pages, and write unique content for each one, so more places per page but with the towns grouped geographically, maybe by state, etc.? Also, could you include Google Maps on these pages to make them more interactive? This is a must-watch video: http://moz.com/blog/panda-optimization-whiteboard-friday
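On the maps point, the quickest route is the embed code Google Maps generates for you (Share > Embed a map). Something like this on each grouped area page, where the src value below is only a placeholder for the long URL Google generates:

```html
<!-- Replace the src with the iframe code Google Maps gives you under "Share > Embed a map". -->
<iframe
  src="https://www.google.com/maps/embed?pb=PASTE_GENERATED_VALUE_HERE"
  width="600" height="450" style="border:0;" loading="lazy" allowfullscreen
  title="Pest control service area map"></iframe>
```

One map per grouped area page keeps those pages more useful to visitors without adding much work.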
So basically I would:
- Work on getting more links
- Write longer blog articles
- Reorganise your 'where we serve' pages
Hope this is all useful for you.