Should digital marketing agencies treat SEO differently when it comes to homepage content?
-
When I review competitor digital agency sites, they seem to have very little homepage content. But how would this be beneficial in gaining a higher SERP rank?
-
I agree with both posters before me. In my experience, digital agencies often use their homepage to grab attention and build a strong sense of branding, while relying on content-rich internal pages to drive organic search results.
-
It's not - those agencies most likely rely on brand appearance and drive traffic through other means (blog content, PPC) for conversions.
Digital marketing agencies compete in a tough organic niche: it's a growing field with a very low barrier to entry, so their websites need to find other ways to compete than targeting the broadest terms and trying to rank the homepage organically for them. Sometimes leaving information out and projecting a polished, successful appearance is enough to convert the traffic that does arrive.
-
I can't see that it's a sensible strategy, really... style over [SEO] substance, perhaps. Some larger brands have told me they understand there's an issue with the lack of words on their homepages (and elsewhere), yet argue their pages can "take it" thanks to strong backlink profiles, etc., and that design impact matters more than copy. I spend much of my life arguing for more words (or at least some!).
This is useful re: how many words/page: http://www.hobo-web.co.uk/how-much-page-text/
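If you want a quick way to quantify how thin a homepage actually is before making that argument, a rough word count is easy to script. A minimal sketch in Python (assuming `requests` and `beautifulsoup4` are installed; the URL is just a placeholder):

```python
import requests
from bs4 import BeautifulSoup

def visible_word_count(url: str) -> int:
    """Fetch a page and count the whitespace-separated words in its visible text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Remove elements whose text never renders for a visitor.
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()

    return len(soup.get_text(separator=" ").split())

if __name__ == "__main__":
    # Placeholder URL - swap in the homepage you want to check.
    print(visible_word_count("https://www.example.com/"))
```

Run it against your own homepage and a few competitors, and the "style over substance" gap usually speaks for itself.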
Related Questions
-
SEO Company wants to rebuild site
Hello Community, I am a designer and web developer and I mostly work with Squarespace. Squarespace has SEO best practices built into the platform, as well as developer modes for inserting custom code when necessary. I recently built a beautiful website for a hail repair company and referred them to several companies to help them with SEO and paid search. Several of these companies have told this client that in order to do any kind of SEO, they'll need to completely rebuild the site. I've seen some of the sites these companies have built, and they are tacky, overcrowded, and hard to use. My client is now thinking they need to have their site rebuilt. Is there any merit to this idea? Or are these companies just using the knowledge gap to swindle people into buying more services? The current site is: https://www.denverautohailspecialists.com/ Any advice would be appreciated.
Local Website Optimization | arzawacki2
-
What is the SEO effect of schema subtype deprecation? Do I really have to update the subtype if there isn't a suitable alternative?
Could someone please elaborate on the SEO effect of schema subtype deprecation? Does it even matter? The Local business properties section of developers.google.com says to "Define each local business location as a LocalBusiness type. Use the most specific LocalBusiness sub-type possible; for example, Restaurant, DaySpa, HealthClub, and so on." Unfortunately, the ProfessionalService page of schema.org states that ProfessionalService has been deprecated, and many of my clients don't fit anywhere else (or if they do, it's not a LocalBusiness subtype). I find it inconvenient to have to modify my different clients' JSON-LD from LocalBusiness to ProfessionalService and back to LocalBusiness. I'm not saying this happens every day, but how does one keep up with it all? I'm really trying to take advantage of the numerous types, attributes, etc. in structured data, but I feel the more I implement, the harder it will be to update later (true of many things, of course). I do feel this is important and that a better workflow could be the answer. If you have something that works for you, please let us know. If you think it's not important, tell us why not (i.e., why Google is wrong). I understand there is always a better use of our time, but I'd like to limit the discussion to solving this Google/schema.org deprecation issue specifically.
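On the workflow side (not speaking to whether Google cares), one option is to generate each client's JSON-LD from a single config rather than hand-editing every page, so a subtype deprecation only means changing one value per client. A minimal sketch in Python, where the business details and the "Plumber" subtype are made-up placeholders rather than a recommendation for any particular client:

```python
import json

# Per-client config kept in one place; if schema.org deprecates a subtype,
# only the "schema_type" value needs to change for that client.
CLIENTS = {
    "acme-plumbing": {
        "schema_type": "Plumber",           # a LocalBusiness subtype (placeholder)
        "name": "Acme Plumbing",            # placeholder business details
        "url": "https://www.example.com/",
        "telephone": "+1-555-0100",
    },
}

def local_business_jsonld(client_key: str) -> str:
    """Build the JSON-LD payload for one client's LocalBusiness markup."""
    cfg = CLIENTS[client_key]
    data = {
        "@context": "https://schema.org",
        "@type": cfg["schema_type"],
        "name": cfg["name"],
        "url": cfg["url"],
        "telephone": cfg["telephone"],
    }
    return json.dumps(data, indent=2)

if __name__ == "__main__":
    print(local_business_jsonld("acme-plumbing"))
```

The output gets dropped into the usual `<script type="application/ld+json">` tag, and keeping the subtype in one config value is what makes the next deprecation a one-line change.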
Local Website Optimization | bulletproofsearch0
-
Local SEO - Multiple stores on same URL
Hello guys, I'm working on a local SEO plan for a client that manages over 50 local stores. At the moment all the stores share the same URL, and I wanted to ask whether it's better to build a unique page for each store or fine to keep them all on the same URL. What do you think? What's the best way, and why? Thank you in advance.
Local Website Optimization | Noriel0
-
Local SEO HELP for Franchise SAB Business
This all began when I was asked to develop experiment parameters for our content protocol and strategy. It should be simple, right? I've reviewed A/B testing tips for days now, from Moz and other sources, and I'm totally amped and ready to begin testing in Google Analytics. Say we have a restoration service franchise with over 40 franchises we perform SEO for, spread all over the US. Every franchise has its own local website (e.g. restorationcompanylosangeles.com) and purchases territories in which it wants to rank; some service over 100 cities. Most franchises also have PPC campaigns, and as part of our strategy we use the location reach data from AdWords to focus on their high-reach locations first. We have 'power pages' covering 5 high-reach branch-preference locations (areas the owners prefer to target) and 5 high-reach non-preference locations. We are working heavily on our national brand presence, working with PR and local news outlets to build relationships for natural backlinks, and developing a social media strategy for national and local outlets. We use major aggregators to distribute local citations for our branch offices and make sure all NAP data is consistent across citations. We are Google partners, so we work with them on new branches to create their Google listings (My Business & G+). We use local business schema markup on all pages. Our content protocol covers the needed on-site optimization tactics: meta, titles, schema, keyword placement, semantic Q&A, internal linking strategies, etc. Our leads are calls and form submissions; we use several call tracking services to monitor calls, the caller's location, etc., and we are testing CallRail to start monitoring which landing pages and keywords generate our leads.

Parts that I want to change: Some of the local sites have over 100 pages targeting 'water damage + city', aka what Moz would call doorway pages. These pages have 600-1,000 words about the services we provide; our four writers vary them so they aren't duplicate pages, but the only unique variable is about 100 words about the city. We pump out about 10 new local pages a month per site - so yes, over 300 local pages a month - yet traffic to the local sites is very scarce. The content protocol/strategy is only tested based on ranking: we have a tool that monitors rankings on all domains, which doesn't account for mobile, local, or user-preference-based searching like Google Now. My team is deeply attached to basing our metrics solely on ranking. The logic is that if no local city page exists for a targeted location, there is less likelihood of ranking for that location, and if you aren't seen you won't get traffic or leads. Ranking for power locations is poor, while less competitive, low-reach locations rank OK. We are updating the content protocol by tweaking small things (multiple variants at a time), then checking rankings every day for about a week to decide whether the experiment was a success.

What I need: An internal duplicate content analyzer, to prove that writing over 400 pages a month about 'water damage + city' IS duplicate content (see the sketch below). Unique content for the power pages - I know from dozens of chats here in the community and in Moz blogs that we can only truly create quality content for 5-10 pages, meaning we need to narrow down which locations matter most to us and beef them up. Blog content for non-power locations. A new experiment protocol based on metrics like traffic, impressions, bounce rate, landing page analysis, domain authority, etc. A deeper dive into call metrics and their sources. I'm at a roadblock because I cannot develop valid content experiment parameters based on ranking: A/B testing requires two pages that are the same except for one variable, and we'd either no-index those or canonicalize them - neither of which works for testing rankings on the same term.

Questions: Are all these local pages duplicate content? Is there such a thing as content experiments based solely on ranking? Any other suggestions for this scenario?
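On the internal duplicate content analyzer point, a rough near-duplicate check can be scripted with word shingles and Jaccard similarity before committing to a paid tool. A minimal sketch, where the two sample snippets are invented placeholders standing in for two city pages:

```python
def shingles(text: str, n: int = 5) -> set:
    """Break text into overlapping n-word shingles (lowercased)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a: str, b: str, n: int = 5) -> float:
    """Share of n-word shingles two documents have in common (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

if __name__ == "__main__":
    # Placeholder copy standing in for two 'water damage + city' pages.
    page_a = "We provide emergency water damage restoration in Pasadena with fast 24/7 response"
    page_b = "We provide emergency water damage restoration in Glendale with fast 24/7 response"
    score = jaccard_similarity(page_a, page_b)
    print(f"Shingle overlap: {score:.0%}")
```

Pages that share most of their shingles apart from the city name will score very high, which is exactly the doorway-page pattern described above.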
Local Website Optimization | MilestoneSEO_LA1
-
Local SEO question
Hi, I was wondering: are there any specific rules for local SEO for a service company that provides a service in a variety of cities but only has one physical location? For example, is it OK to target the other cities in title tags, or would this be frowned upon? Regards
Local Website Optimization | TheZenAgency0
-
How can I rank my .co.uk using content on my .com?
Hi, We currently have a .com site ranking second for our brand term in the .co.uk SERP. This is mainly because we don't own the exact-match brand domain, which comes from not having a clue what we were doing when we set up the company. Would it be possible to outrank it, considering the weighting that Google puts on exact matches in the URL? N.B. - There are a few updates we could make to the homepage to improve the on-page optimisation, and we have not actively done any link building yet, which will obviously help. Competitor: SERP rank 1 - Moz PA 38, DA 26. Our site: SERP rank 2 - Moz PA 43, DA 32. Thanks, Ben
Local Website Optimization | benjmoz0
-
Had SEO Firm tell me to Start Over - pros and cons help please
Hi, So I have quotes of 1,250 to 2,500 a month to run my website, SEO-wise. What I am told is they will do all Facebook posting, 4 blog posts each month, some citations, and site optimization. Those amounts do seem like a lot. Yet I was told to start all over. Basically I was told that because of some bad backlinks, of which only a few remain, you can never recover from an algorithmic penalty, and that with a disavow it's like telling Google "penalize me, please." So the plan was this: $3,000 for a new site on a new domain, which then has no penalties, and I will be ranking in no time. The problem is I am branded. My domain and business name is Bernese Of The Rockies. People know us and we are very respected, so if we create a new site like example.com, I do not want to mislead people. Or, if there is a penalty, maybe use it as a landing page or separate site that sends people to my main site for more info - that type of thing. I'm just looking for your input on whether this is a common issue: if you have a non-manual (algorithmic) penalty, must you really restart? Thank you so much for your thoughts and suggestions.
Local Website Optimization | Berner0