Daytona Beach Web Design vs. Daytona Web Design: What's Best?
-
Three months ago we had our team create local pages for some of the services we offer, e.g., web design. As we reviewed the pages, we found they had created two pages with similar content, one at each URL:
- /daytona-beach-web-design/
- /daytona-web-design/
We knew we had to kill one of them to avoid duplicate content. Here is where the hard decision came in, and hence the question.
We thought about keeping the /daytona-beach-web-design/ URL, but for some reason Google had already crawled the shorter version, /daytona-web-design/. So we ended up deleting the longer URL and keeping /daytona-web-design/ instead.
Which one would you keep and have you experienced similar issues?
-
The decision to kill /daytona-beach-web-design/ was rather simple. Even though both pages were created at the same time, Google Search Console showed that only one URL got indexed. So we deleted one and expanded the content on the other.
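For what it's worth, rather than simply deleting the retired page (which leaves a 404 and drops any links pointing at it), a 301 redirect to the surviving URL passes that equity along. A minimal sketch, assuming an Apache host with an .htaccess file (the URLs are the ones from the post):

```apache
# .htaccess — permanently redirect the retired duplicate to the kept page
Redirect 301 /daytona-beach-web-design/ /daytona-web-design/
```

Nginx and most CMSs have an equivalent; the key point is the permanent (301) status so search engines consolidate signals on the URL you kept.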
-
It was wise to kill one of those pages. If I were in this position, I would use as much data as possible to inform the decision. I would look at:
- Google Search Console: impressions, average position, CTR
- Google Analytics: entrances, page value, bounces
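To make that concrete, here is a toy sketch (not Moz's or Google's API — the weights and numbers are purely hypothetical illustrations) of scoring two duplicate URLs with the metrics listed above to decide which one to keep:

```python
# Toy decision helper: pick which of two duplicate URLs to keep,
# given metrics pulled manually from Search Console / Analytics.
def pick_url_to_keep(candidates):
    """candidates: dict of url -> metrics dict.
    More impressions, clicks, and entrances are better;
    a lower average position is better."""
    def score(m):
        # Weights are arbitrary illustrations, not a recommendation.
        return (m["impressions"] + 10 * m["clicks"] + m["entrances"]
                - 50 * m["avg_position"])
    return max(candidates, key=lambda url: score(candidates[url]))

# Hypothetical numbers for the two pages from the question:
data = {
    "/daytona-beach-web-design/": {"impressions": 40, "clicks": 1,
                                   "entrances": 3, "avg_position": 38.0},
    "/daytona-web-design/": {"impressions": 900, "clicks": 25,
                             "entrances": 60, "avg_position": 9.5},
}
print(pick_url_to_keep(data))
```

The real work is pulling honest numbers for each URL over the same date range; once you have them, the comparison itself is usually this simple.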
Related Questions
-
Current advice or best practice for personalization by geolocation?
What is the current advice for displaying content based on a user's geolocation? On the one hand, I know the rule of thumb is that you are not supposed to treat Googlebot any differently from any other visitor to your site, and shouldn't show it different content than you would show a regular user. On the other hand, if we personalize the content based on geography, doesn't that mean the content in Google's index would be specific to Mountain View, CA? I heard years ago that the best practice was to use JavaScript to personalize the content client-side and block the JS with robots.txt, so that Google indexes a default page rather than a geo-specific page. Any insights or advice appreciated.
Local Website Optimization | IrvCo_Interactive
What is the best way to differentiate and optimize two similar websites' SEO?
What is the best way to differentiate and optimize the SEO of two similar websites, bearing in mind that they do not produce content?
Local Website Optimization | EmmaGeorge
Best SEO Option for Multi-site Set-up
Hi guys, we have a business-to-business software website. We are a global business but mainly operate in Ireland, the UK, and the USA. I would like your input on best practice for domain set-up for the best SEO results in local markets. Currently we have example.com (no market specified), and now we are creating:
- example.com/ie (Ireland)
- example.com/uk (United Kingdom)
- example.com/us (United States)
My question is mainly about the example.com/us site: should we create example.com/us for the US market, or just use example.com for the US market? If the decision is example.com/us, should we build links to that directory or to the main .com website? To summarize, there are two questions: 1. Advice on domain set-up. 2. Which site to build links to if example.com/us is the decision. Thank you in advance, Glen.
Local Website Optimization | DigitalCRO
Call Tracking Best Practises for General SEO
Hey folks, I'm aware of the importance of consistent citations, and the mayhem call-tracking numbers have been known to cause in that regard in the past. So I just wanted some up-to-date clarification on two things: 1. Local SEO isn't strictly speaking a big deal for us, as we supply software and as such are technically global. I'm presuming consistent citations are still worth aiming for, though, and will help increase general authority as well? Let me know if I'm totally wrong about that! 2. What's the best-practise set-up for call tracking, given that you'd obviously want your main NAP number hardcoded somewhere, alongside showing your dynamic numbers to relevant visitors? Apologies for any ignorance; as always, any help and advice is muchos appreciato.
Local Website Optimization | Zoope
SEO geolocation vs subdirectories vs local search vs traffic
My dear community and friends of Moz, today I have a very interesting question for you all. Although I've got my opinion, and I'm sure many of you will think the same way, I want to share the following dilemma with you. I have just joined a company as Online Marketing Manager and I have to quickly make a decision about site structure. The company's site has just undergone a big structural change. It used to have its information divided by country, each country in its own subdirectory: www.site.com/ar/news, www.site.com/us/news, etc. They have just erased the country subdirectory and started using geolocation. So if we go to www.site.com/news, although the content is going to be the same for each country (it's a Latin American site; all the countries speak the same language except Brazil), the navigation links are going to take you to different pages according to the country where you are located. They believe that with fewer subdirectories, PA or PR is going to be higher for each page due to less link juice leaking. My guess is that if you want a serious organic traffic presence, you should either A) get a TLD for the country you want to target, or B) have a subdirectory or subdomain for each country on your site. I don't know what local signal a page could be giving to Google if the URL and HTML don't change between countries. We cannot use schemas or rich formats either... So, again, I would suggest going back to the previous structure. On the other hand, I've been taking a look at sensacine.com, and although their site is pointing only to Spain, they have very good rankings for big-volume keywords across all of Latin America. So I just want to quantify this change, since I will be sending the designers and developers a lot of work.
Local Website Optimization | facupp
.com vs .com/language ?
Hello Moooooooooz! We're currently working on a new website, http://www.globalmetal.fr/, which has deep SEO issues. The problem is, as always in this case: one company + different subsidiaries + different markets + different languages. The company handles different domains: http://www.globalmetal.fr/, www.globalmetalbroker.ch, www.globalmetalbroker.com, and so on. Until recently I was totally convinced (there is no magic solution, I know) that it was better for an SME to focus on one domain (.com) and serve the other languages as .com/fr, .com/es, etc. But in their case the .com is pretty new: www.globalmetalbroker.com (DA 1) vs. globalmetal.fr (DA 15). So I'm wondering:
1. Does Google understand that globalmetal.fr is the French website of globalmetalbroker.com (maybe via Webmaster Tools)?
2. Does it make sense to move all the (new) language websites into .com/[folders] and, once the .com's DA is doing better, redirect the .fr to .com/fr?
3. Is it better to focus on .com, .fr (but French speakers are not just in France), .ru and so on, or to keep the .com/[languages]?
Hope someone has had the same issue recently 😛
Local Website Optimization | JoomGeek
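On question 1, one explicit way to tell Google that two URLs are language alternates of each other is hreflang annotations. A sketch using the domains from the post (treat the exact language-to-domain mapping as an assumption; only the poster knows which site serves which audience):

```html
<!-- On the French homepage (globalmetal.fr) — reciprocal hreflang tags -->
<link rel="alternate" hreflang="fr" href="http://www.globalmetal.fr/" />
<link rel="alternate" hreflang="en" href="http://www.globalmetalbroker.com/" />
<link rel="alternate" hreflang="x-default" href="http://www.globalmetalbroker.com/" />
```

Every page in the set needs the full set of tags pointing at all of its alternates (they must be reciprocal, or Google ignores them); the same annotations can alternatively be declared in an XML sitemap.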
Best way to remove spammy landing pages?
Hey Mozzers, We recently took over a website for a new client of ours and discovered that their previous webmaster had been using a WordPress plugin to generate 5,000+ mostly duplicated local landing pages. The pages are set up more or less as "Best (service) provided in (city)" I checked Google Webmaster Tools and it looks like Google is ignoring most of these spammy pages already (about 30 pages out of nearly 6,000 are indexed), but it's not reporting any manual webspam actions. Should we just delete the landing pages all at once or phase them out a few (hundred) at a time? Even though the landing pages are mostly garbage, I worry that lopping off over 95% of a site's pages in one fell swoop could have other significant consequences. Thanks!
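If the decision is to retire those pages outright, one common approach is to serve 410 Gone for them, which Google treats as a somewhat stronger removal signal than a plain 404. A sketch, assuming an Apache server; the URL pattern here is hypothetical and would need to match the plugin's actual slugs:

```apache
# .htaccess — serve 410 Gone for the auto-generated "/best-<service>-<city>/" pages
# (hypothetical pattern; adjust to the plugin's real URL structure)
RedirectMatch 410 ^/best-[^/]+-[^/]+/?$
```

A pattern-based rule like this lets you retire all of the pages in one change without listing 6,000 URLs individually.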
Local Website Optimization | BrianAlpert78
What's the best way to add phrase keywords to the URL?
Hi, Our keywords are all our service + a list of towns (for example, "carpet cleaning St. Louis"). The issue I'm having is that one particular site could be targeting "carpet cleaning St. Louis", "carpet cleaning Manchester", "carpet cleaning Ballwin", "carpet cleaning Kirkwood", etc. etc. etc... up to maybe 15 different towns. Is there a way to effectively add these keywords into the URL without making it look spammy? I'm having the same issue with adding the exact keywords to the page title, img alt tag, etc. Thanks for any advice/input!
Local Website Optimization | nataliefwc