-
Hi Lina,
Can't you just make one page for city guides and one for hotels, use those as head pages, and put the rest under those?
So
head page: hotels
under that: battambang, and under that one the classy hotels page
And for the guide something like:
head page: expat cities
under that: battambang guide
Just an idea, good luck with redirecting!
Posts made by Bob_van_Biezen
-
RE: Need URL structure suggestions
Hi Lina500,
I think you’re right about changing this before moving on. In any website we develop, the structure comes before content creation.
I understand your concern about keywords in the URL and the length of the URL. The current URL is pretty confusing for your visitors and is probably too long to read.
Luckily for you, Google doesn’t put too much weight on keywords in your URL and they are getting better and better at understanding what pages are about. A word like “accommodations”, for example, will be recognized as a close variant of “hotels”. This way a descriptive URL is just fine.
In your situation I would find out what words are used most. For example:
Accommodations vs hotels
Island guides vs city guides
If the most searched keyword describes the page I would pick that one. If however the other keyword fits the content of that page best I would pick the other keyword.
Your own suggestion movetocambodia.com/expat-city-guides/battambang/accommodation/classy-hotel/ seems like a pretty good one to me.
I would only skip /expat-city-guides/ and make it movetocambodia.com/battambang/accommodation/classy-hotel/, since these are keywords that get used in combination. People won’t search for “city guide classy hotel”, for example.
In the ideal situation I would go for /expat-city-guides/battambang/ for the city guides and put your accommodation overview under /accommodations/battambang/classy-hotel/ (without /expat-city-guides/ before it).
I hope this helps!
-
RE: Does sitemap auto generation help in SEO 2015
Hi Omverma,
I totally agree with Hectormainar: sitemaps aren’t a must, but I would definitely recommend them! Any website with a few levels (webshops for example) will have large amounts of deeper pages that are hard to reach through the internal link structure, and a sitemap definitely helps with that.
Besides that, if you upload your sitemap in Google Webmaster Tools you get a quick view of how many pages Google has indexed. For me this is a quick indication of any indexation problems that might be going on.
I hope this helps!
-
RE: Capitals & Lowercase Titles being seen as different in MOZ
Hi Dave,
Moz is actually right about this. Those are duplicate versions of the same page and should have a canonical tag (or shouldn’t be generated in the first place).
You can find some more information about canonical tags here:
https://support.google.com/webmasters/answer/139066?hl=en
In your case I would also check why these different URLs are generated and make sure your internal links all point to one version. Screaming Frog can help you map all your internal links and find any links to the duplicate versions of the pages.
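For reference, the canonical tag is a single line in the `<head>` of both URL variants, pointing at the one version you want indexed (the URL below is just a placeholder):

```html
<!-- Placed on both /Product-Page and /product-page; both then tell
     search engines that the lowercase version is the one to index. -->
<link rel="canonical" href="https://www.example.com/product-page" />
```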
I hope this helps!
-
RE: Local citations from business directories in other countries
Hi Patrick,
Thanks for your response!
I did read the resources on Moz and couldn’t find an answer to this particular question. “Ignore the low-quality sites and focus on the top 50” comes closest to what I’m looking for.
Any thoughts about how Google handles citations of a Dutch business on big international business directories?
-
Local citations from business directories in other countries
Hi all,
I normally work for clients in my home country (the Netherlands), and with local citation building I focus on Dutch websites or well-known .com websites in the Netherlands. My rule of thumb kinda was: if it’s not known in the Netherlands, it isn’t worth getting mentioned there.
Since the Netherlands is pretty small and I think Google isn’t perfect, I was wondering if it makes sense to list a Dutch business on any .com business listings that are internationally big but aren’t well known in the Netherlands.
Two reasons that got me thinking this direction:
- A big, well-known Dutch company offers a service such as Moz Local and integrated their service with several international business listing websites that I had never heard of, since these business directories focus on other parts of the world.
- Google isn’t perfect, and I think they have more budget to identify trustworthy business directories with an international focus or a focus on America than with a focus on the Netherlands.
So I’m wondering if it makes any sense to list a Dutch business on let’s say the top 20 international business directories (although these directories don’t have any brand recognition in The Netherlands).
-
RE: Gets traffic on both domain.dk and domain.dk/default.asp
Hi Kasper,
I think your own suggestion is just right. Put a 301 redirect on the /default.asp page and your linkjuice should be fine.
Two extra hints:
1. Make sure your internal links point to your normal homepage (not the /default.asp).
2. If you have any valuable backlinks pointing to /default.asp you can try to change (ask the webmasters) the backlinks to your normal homepage.
Both suggestions prevent your linkjuice from going through a 301 redirect. Although 301 redirects pass linkjuice, there is a damping factor (0.85) in the original PageRank system which lowers your linkjuice a bit with every link / 301 redirect it passes through.
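As a sketch, on an Apache server the 301 could be set up in .htaccess roughly like this (assuming mod_rewrite is available; treat it as an example, not a drop-in fix for your setup):

```apache
# Permanently redirect /default.asp to the homepage
RewriteEngine On
RewriteRule ^default\.asp$ / [R=301,L]
```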
Good luck!
-
RE: How difficult is schema.org markup
Hi Alan,
I totally agree with you about the ease of editing a hand-made website if it's built the right way (so the developer knows where to look).
My point is mainly about code quality: if it's all spaghetti, I think the price quote can be higher than with a CMS system. Just wanted to clear that up.
-
RE: How difficult is schema.org markup
Hi Joshua,
I think it’s important to know what system you’re using. If it’s all custom made and the web developer needs to “get into” your code, it will probably cost a bit more than when you’re using generic systems like Magento or WooCommerce.
I’m not a developer, but my business partner normally fixes these things within a few hours. A quote between 100 and 200 sounds reasonable since it’s just adding pieces of code in the right spots.
-
RE: What day of the week should I set my campaigns up on?
Hi scienceisrad,
Great to see you investigate SEO on your own!
Personally I think there isn’t any specific “best day”. With all measurements it’s important to do them consistently, in the same way, to make sure your comparison is valid. I would compare this with tracking weight loss. It doesn’t really matter which day of the week you pick, just make sure it’s the same day, same time and uhmm before or after you take a crap.
In your case, I would just remove the one campaign and set it up on Monday again. Although rankings fluctuate a bit, I think you just need a global picture of your competitors to make decisions about your own strategy (unless you’re doing some big SEO project with a whole team of course).
I hope this helps!
PS. It’s also possible to make one campaign and add competitors so they do get monitored as well.
-
RE: Wordpress 503 errors
It's indeed a paid option. I think the most important thing is that the server is responding correctly at the moment and that you find out what caused the errors. If you check any pages with a 503 status manually, do they respond?
Another option is to check the server logs to see if you can find anything there, or call your hosting provider and ask them why the server is giving 503 errors. If you didn’t cause the problems it’s important to know what did, since they may come back and cause your website to stop working correctly again.
Webmaster Tools will only give you an overview of what Google finds at the moment they visit your website, so it might not give you the most accurate answer as to what caused the problem.
-
RE: Redirect HTTP to HTTPS
Hi Harry,
If you have already implemented HTTPS on your website you should definitely start redirecting, since you will otherwise have a duplicate version of your website.
But I suppose you’re still in the process of making that decision. In that case I would read this post on Moz, which gives a great overview of what HTTPS can and cannot do for your website (and your SEO).
http://moz.com/blog/seo-tips-https-ssl
SEO-wise the main point is that this is just a small ranking factor which might not be worth the time and hassle. Personally I would implement it in a new website, but wouldn’t (solely based on the SEO benefits) change an existing website to HTTPS. In that situation the security, privacy and trust reasons should be important to you as well to justify the decision.
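If you do make the switch, a common way to do the redirect on Apache is a site-wide 301 in .htaccess. A minimal sketch (assuming Apache with mod_rewrite; your hosting setup may differ):

```apache
# Send every HTTP request to its HTTPS equivalent with a 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```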
I hope this helps!
-
RE: Keyword density or No. of Time keyword used
Hi Jonathan,
I totally recognize your situation. Although I don’t really check the keyword counts of the competition, I do write articles and find myself using the keyword a LOT. This while I don’t really think about “using” it while I write the articles.
In these cases I Ctrl+F, search for the keyword and try to change a few sentences. I normally don’t put more than 5 minutes of work into it and I only change stuff if it’s still understandable for the reader. Unfortunately I don’t have case studies where this method actually pays off, so I would love to see some more about this topic as well!
-
RE: Wordpress 503 errors
Hi Happy SEO,
A 503 error is a server error which can indicate a server overload.
It could be possible that you created these errors by using Screaming Frog. This tool makes a lot of requests in a very short period. I would try to run the tool again with the speed adjusted to 1 thread and 2 URLs/s.
You can find the speed settings under configuration. If this doesn’t resolve the problem feel free to post a response.
PS. When I use the tool I only get a response from the homepage. This does indicate the server is blocking access.
-
RE: 301 redirects for 3 top level domains using WP SEO Yoast
Great to see you found an answer Justin!
-
RE: Deleting Blog posts
Hi Justin,
I’m afraid I can’t help you with this one, since it has a lot to do with how your website is structured and how you’re setting up your redirects at this moment. I would ask a programmer for advice to provide you with the best solution.
Of course I would love to help you with any more SEO questions!
Good luck!
-
RE: Deleting Blog posts
Hi Justin,
I just checked out your new design and it looks great!
About your question: if the 301 redirect is set up correctly it will be processed first, so every bot or visitor will get redirected, won’t see the old page, and it can’t cause duplicate content. Take into consideration that keeping a page live will prohibit you from using this URL anywhere else on the website. If you wish to keep the pages in your backend though, I would set the pages to draft.
Good luck!
-
RE: Deleting Blog posts
Hi Justin,
You're thinking in the right direction. You should indeed 301 redirect the old URL. This helps you transfer your linkjuice to the new page and it gives search engines a clue about where the content went. This way they will drop the old listing faster and pick up the new URL earlier.
In case you’re not redirecting, 404 URLs will drop from the SERPs over time and the new ones will be picked up. A good sitemap and internal URL structure can help speed up the indexation.
Good luck, it sounds like you’ve been making some real progress these last few weeks!
-
RE: New gTLDs
Hi LupoNorth,
I didn't read anything about it, but my guess is that the perception of the search engines of these new TLDs will change over time based on how they get adopted. At this moment I would say a .com domain will rank better than .shop, although the purpose of this TLD should be clear to humans.
I know search engines are struggling with TLDs from other countries. So if I’m in the Netherlands (in fact, I am) and I decide to use www.example.it because I have an IT business, the domain won’t pop up in the Dutch search results (although the text is all Dutch). Something similar could happen with the new domains. If for example .shop gets adopted by a lot of web shops, it could happen that a blog would score less with a .shop domain.
I hope this helps and I’m interested in how others feel about this subject.
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
This was the part that triggered me:
"Google Fellow Amit Singhal explains that “Different searches have different freshness needs.”
The implication is that Google measures all of your documents for freshness, then scores each page according to the type of search query."
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
Thanks a lot! Kinda made me realize I really should read some more about this update. Might be off topic, but what's your view on freshness applied to **all** pages? In this Whiteboard Friday it's stated it only impacts the terms you describe:
http://moz.com/blog/googles-freshness-update-whiteboard-friday
But in this blog post from that time (before the sum-up) it’s stated that it’s applied to all pages, but affects search queries in different ways:
-
RE: Is Syndicated (Duplicate) Content considered Fresh Content?
Hi Alan,
Is there any source / own research that can back up this answer?
Would love to read more about this subject!
-
RE: Help in Internal Links
Hi Ravi,
Always use dofollow links. Only in rare cases where you don't want pages indexed could you consider adding a nofollow tag, but besides that, always use dofollow.
Why: dofollow links pass PageRank and help your website score better in search engines. If you use nofollow links, your links won’t pass any PageRank and, besides the homepage, your website won’t score with any pages.
There was a time people tried using nofollow tags to “sculpt” their PageRank flow to help specific pages rank better. This practice doesn’t work, since a nofollow tag won’t increase the PageRank flow to the other dofollow links on your website.
Basically you can see a nofollow tag as a big PageRank-eating black hole. You don’t want to throw your PageRank in it.
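For reference, the only difference in the HTML is the rel attribute (example.com is a placeholder):

```html
<!-- Normal (dofollow) link: passes PageRank -->
<a href="https://www.example.com/">Example</a>

<!-- Nofollow link: no PageRank is passed -->
<a href="https://www.example.com/" rel="nofollow">Example</a>
```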
I hope this helps!
-
RE: Home page cannibal
Hi Kathy,
You’re not the only one with this problem. There have been a lot of questions on Moz about why homepages or subpages score better on certain keywords. Here are some of the results (including people who face the opposite problem, subpages beating their homepage).
- http://moz.com/community/q/how-to-make-my-good-sub-page-rank-ahead-of-my-generic-home-page
- http://moz.com/community/q/wrong-page-ranking-for-keyword-should-i-move-the-better-content-over
- http://moz.com/community/q/why-is-my-homepage-ranking-below-a-subpage
- http://moz.com/community/q/homepage-ranks-worse-than-subpages
- http://moz.com/community/q/subpage-ranking-for-homepage-keyword
- http://moz.com/community/q/why-is-a-sub-page-ranking-over-home-page
Reading those will give you a better understanding of the factors involved in Google’s decision. Since Google tries to show the most relevant result, it’s your job to hint them in the right direction. In your case I saw your category page is missing a page title, which can help Google determine what the page is about. I would also suggest adding a bit more content on this page.
Your blog post is probably not picked because it’s harder to reach through the internal link structure and doesn’t carry as much link status as your homepage.
Last quick tip: you can check how Google prioritizes your website on a certain keyword by searching site:vanillaqueen.com "bulk vanilla beans" in the search bar. You can see there that your category page is placed second.
I hope this helps!
-
RE: How to find keyword related questions to answer?
Hi Landon,
I would first try to figure out if there are any communities within your industry that have a questions / answers section. If so, it might be more beneficial to focus on one or two communities and make a name for yourself (which can result in business relations and multiple backlinks from other websites).
If there aren’t any or you’re not in it for the long run I would try:
- Search Google with adjusted filters, content from the last 24 hours for example. Something like “seo question” with a filter on the last 24 hours will give some results.
- Set up Fresh Web Explorer or Google Alerts for frequently asked questions, although it will be hard for a robot to give you only the questions about your topic. Most services just return very specific phrases like brand mentions.
I hope this helps!
-
RE: Focus Keyword
Hi Via Trading,
Difficult question, since you’re asking about a borderline situation on one of all the different ranking factors out there. About a year ago I found myself trying to dive this deep into specific ranking factors as well, and I found a lot of conflicts between “best practices” for ranking factors and what would make the most sense for users. Besides these conflicts, I found there is a lot of exact information (percentages for example) that isn’t available in the SEO community. Looking at your questions, I think nobody is willing to invest their time in setting up large-scale tests to find out how these two correlate. Not because the question isn’t legit, but because there are bigger ranking factors that we don’t fully understand.
When you find yourself diving this deep into a ranking factor, I advise you to stick to what makes the most sense for your visitors. This way you don’t do any over-optimization and you can quickly switch to investing your time in more important aspects of your website (content, promotion and just plain making your product awesome).
To help you further, I will try to give you some more information (and my view on this topic) about your question.
1. If it makes more sense to use “liquidation channels” I would go for that one, since your page will probably be about multiple channels. If this is the case, then just switching your URL won’t make your page content about one channel; it would still have content about multiple channels on it. Search engines do a great job at recognizing this, so if you want to rank for “liquidation channel” you should probably change the context of your webpage as well to become the perfect match for search engines. That said, I think a web page about multiple liquidation channels can be very helpful when I look for one. In my experience you can rank for both keywords with either of the URLs. Here is a screenshot I just took of a situation like yours that we have on our own website.
http://i.imgur.com/pkfdAz3.png
With the keywords “tattoo shop” and “tattoo shops” we found it harder to rank on the singular than on the plural, but with both keywords we managed to get a #1 position (we didn’t specifically link build on any of those terms). Just like you, I thought changing my URL would increase my chances to rank for the singular (since this keyword has the most traffic). So when we expanded our business to a new industry (driving schools / “rijschool” in Dutch), we used the singular in the URL but still found ourselves ranking better on the plural. My bet is that this is because our page is about multiple driving schools and not just one driving school. Besides that, I think our backlinks help a little to push this result as well, so it’s not 100% exact science.
Backlinks are an important factor to consider as well: if you change your page about a plural into a singular but your content is still about a plural, you will find other websites using anchor texts with the plural instead of the singular. So in the end, you will still have a harder time with the singular than with the plural.
2. The quick answer (this comment is already getting way out of hand): it won’t be the full 100%, since a general term like that often triggers results such as Wikipedia or other descriptive sources. You will definitely have a correlation with this term, but don’t expect too much from it. Your best chance to also score on this term with your homepage is to make sure your whole domain is about this subject. That would make you a better fit for a general term like this. In my opinion you should focus your homepage on a more specific term that does the best job of describing your website or main product or service.
I hope this helps!
-
RE: Pornhub-forum spam
We saw a lot of these porn spam referrals these last few weeks as well, with a brand new website. I would just filter it out and not worry about it.
-
RE: Spam score
Hi Shamshuddin,
Like Dennis said, the spam score shows how many of the 17 possible spam flags are triggered. These flags are triggered by link metrics or on-page metrics. Based on the number of flags triggered, Moz found a correlation between those factors and the share of websites that have been penalized.
Here is the correlation between the number of triggered flags and the share of penalized websites:
- 0 flags = 0.5% of found websites were penalized
- 1 flag = 1.0% of found websites were penalized
- 2 flags = 2.0% of found websites were penalized
- 3 flags = 4.2% of found websites were penalized
- 4 flags = 7.5% of found websites were penalized
- 5 flags = 11.4% of found websites were penalized
- 6 flags = 16.2% of found websites were penalized
- 7 flags = 30.6% of found websites were penalized
- 8 flags = 56.8% of found websites were penalized
- 9 flags = 71.9% of found websites were penalized
- 10 flags = 77.3% of found websites were penalized
- 11 flags = 87.3% of found websites were penalized
- 12 flags = 93.4% of found websites were penalized
- 13 flags = 98% of found websites were penalized
- 14 flags = 100% of found websites were penalized
- 15 flags = 100% of found websites were penalized
- 16 flags = 100% of found websites were penalized
- 17 flags = 100% of found websites were penalized
In your case there is no need to worry. If you wonder which spam flags you triggered, you can check http://bit.ly/1P5WSeG, then click on any of the graph icons to the left of the domains in the results and you will see this: http://bit.ly/1GdbEyS
I hope this helps!
PS. There is a whiteboard friday about this subject too: http://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday
-
RE: Sitemap issues 19 warnings
Hi Justin,
In that case I would ask your developer to make the sitemap on the website update automatically (or generate a new one every day), and submit that link to Webmaster Tools. If he's a real genius he could add your blog pages from WordPress to this sitemap as well, but I'm not sure if WordPress has a hook for this.
Alternative options:
- Let him make the automatically updated sitemap for the custom part of the website and use this combined with the sitemap from the Yoast plugin. You can upload both separately in Google Webmaster Tools; just make sure both have their own URL. In this case it’s all automated and just as good as the previous method.
- Keep updating your sitemap manually. Just make sure you don't both use the Yoast sitemap and include the blog posts in your sitemap from Screaming Frog, since this would give duplicate input. If you choose to refresh your sitemap manually, I would disable the sitemap within the Yoast plugin and use the Screaming Frog sitemap, which should include your blog pages as well.
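If your developer wants a starting point, generating a sitemap is little more than writing a list of URLs into the standard XML format. A minimal sketch in Python (the URL list would come from your database or CMS; all URLs here are hypothetical placeholders):

```python
from datetime import date
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Build a minimal sitemap.xml string from a list of page URLs."""
    today = date.today().isoformat()
    entries = []
    for url in urls:
        entries.append(
            "  <url>\n"
            f"    <loc>{escape(url)}</loc>\n"
            f"    <lastmod>{today}</lastmod>\n"
            "  </url>"
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

# Example: regenerate the file daily from your list of live pages
pages = ["https://www.example.com/", "https://www.example.com/blog/"]
xml = build_sitemap(pages)
```

Run on a daily cron, this keeps the sitemap in sync so deleted pages drop out automatically instead of lingering as 404s.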
Good luck and let me know if this works for you!
-
RE: Sitemap issues 19 warnings
Hi Justin,
Thanks for the screenshots. Dirk's suggestion about Screaming Frog should be really helpful. This should give you insight into the true 404 errors that a bot can encounter while crawling through your internal site structure.
Based on what I see, I think your main problem is the manually updated sitemap. Whenever you change a page, add a new one or mix up some categories, those changes won't apply to your sitemap. This creates 404 errors for pages that aren't linked to from your website and that (without a sitemap) wouldn't give any 404 error messages in Google Webmaster Tools.
I saw you were already using SEO by Yoast, so I suggest using their sitemap functionality. That should resolve the problem and save you work in the future, since there is no need to manually update your sitemap again.
Let me know if this works!
-
RE: Sitemap issues 19 warnings
Hi Justin,
Could you post a screenshot of the error message and any links pointing to this URL? This way we can identify which pages return a 404. If these are important pages on your website I would fix it right now; if however they are pages you don’t use or your visitors rarely see, I would make sure you pick this up with the redesign. No point in fixing this now if things will change in the near future. Besides that, sitemaps help you get your website indexed; releasing this two weeks earlier won’t make a big difference for the number of indexed pages, since you won’t change your internal link structure and website authority (both help you get more pages indexed).
About your last point, could you provide me with a screenshot of this as well? When I check zenory.com/sitemap.xml I find the .com sitemap, so that part seems fine.
PS. I would suggest you change the update frequency in your sitemap. It now states monthly; it’s probably a good idea to set this much faster, since there is a blog on your website as well. At the moment you are giving Google hints to only crawl your website a few times a month. Keep in mind that you can give different parts of your website a different change frequency. For example, I give pages with user-generated content a much higher change frequency than pages we need to update manually.
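For the record, the change frequency is a per-URL hint inside the sitemap, so you can mix values for different parts of the site (URLs below are placeholders):

```xml
<url>
  <loc>https://www.example.com/blog/</loc>
  <changefreq>daily</changefreq>
</url>
<url>
  <loc>https://www.example.com/about/</loc>
  <changefreq>monthly</changefreq>
</url>
```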
-
RE: Sponsored Backlinks on High P.A. and PR sites
Hi Silvio,
In general I would say that you should focus on getting mentioned in places where you can find your target audience. Just looking at websites with high PA/DA scores won’t be the best fit for your business.
That said, if you buy any sponsored backlinks you should apply the nofollow tag or you risk a penalty. And since nofollow links don’t pass PageRank, it won’t help your site rank any better. It could however be a way to get some eyeballs, which could result in earned backlinks. This is however heavily dependent on where (relevance) you place a sponsored link and to what resource this link leads.
I hope this helps!
-
RE: Compare sites?
Hi sdwellers,
Tools give you insight into what might be the cause of ranking problems, but they don't tell the whole story about a website's rankings. Algorithms are very complex and they take a lot of different signals into account. Here are a few signals that are very hard for any tool to take into account:
- Dwell time (the time people spend on your site, and if they do or do not find the information needed)
- The total backlink profile. Moz has its own index (PA/DA are based on this index) which only covers a fraction of all the links. A normal backlink audit would at least combine links from three different sources, including the links from Google Webmaster Tools.
- Reviews about your business.
If you are serious about beating your competition I would dive deeper in SEO, or hire an expert. That said, I can give you a few tips based on my first impression of your website.
- The black text (on the blue background) above and under your website is very toxic. This is a classic spam tactic which gets you discounted by search engines.
- Your website isn’t mobile friendly. This will become an important ranking factor on April 21 this year (next week); Google has been talking about this date for a while.
- At first sight, I would say you could make your content more engaging. Your competitor’s website gives a whole different feeling. I would take a good look at your design and the use of visual content. Scuba diving is all about a lifetime experience; keep people hooked to your site and your dwell time (and sales) will go up.
I hope this helps! Good luck, and I would love to hear if you could make a difference the coming months.
PS. Facebook likes aren’t used to rank websites, although a big social audience can help you get more backlinks, which help a lot.
-
RE: 0 status codes
Hi Kayleigh,
Like Patrick said, just try to recrawl these pages. I assume that will fix the problem.
If that still gives the errors, I would try it again with adjusted speed. Some servers automatically block crawlers (IPs) that cause a spike in their server requests. Screaming Frog can easily trigger this (happened to me as well). You can set the speed under Configuration.
If you give this a try I would go with one thread at a time and 0.5 URLs a second. That worked for me. If it was a block, you may need to get your IP address unblocked before you try again. Good luck!
-
RE: How does Tripadviser ensure all their user reviews get crawled?
Hello Linklater,
The short answer: internal site structure + domain authority.
For the long answer, could you provide me with an example URL? I tried to replicate your setup but didn't find the pages you describe.
-
RE: How can I tell Google not to index a portion of a webpage?
Hello Brad,
Just to get your question clear:
Am I correct that you want a method that lets Google (and other search engines) know a portion of your pages are duplicates, while you want both the duplicated pages and the original pages to rank in the SERPs?
If you could provide us with an example (link) that would help a great deal as well.
-
RE: Best strategy to follow for a single service site
Great addition from Jonathan! I also think a well-implemented example will not get affected by this update, but identical pages won't do you much good in the future. So if you start with this strategy, make sure you do it well!
-
RE: Best strategy to follow for a single service site
Hi OLLI_M,
I would definitely focus your homepage on the keywords / search intents for your primary keyword. This page is your best chance of scoring in your own city and has the possibility to score in cities close by.
You don’t have to worry about scoring on your brand because:
- When people search for your brand name they want you. So from a search engine's perspective you're the 100% match with this search intent, and you will have a great dwell time and CTR, which work in your favor.
- You will probably get enough inbound links with your brand name in it. That’s very natural.
There are probably a lot more reasons why you will score on your own brand name but I hope these make it clear you don’t have to worry about it.
When you determine your website structure you will probably need some insights in the following:
- Are people adding local modifiers like “Car service [city name]”?
- What is the search volume on those keywords? Based on this, are there any more cities in your service area you intend to rank in besides the city you’re located in?
- What is the competition on the keywords you intend to rank on?
In a normal situation where you have one physical location and you serve a couple more nearby cities, I would advise you to focus your homepage on the city you’re based in and add level-one pages for the cities in your service area you intend to rank for. Make sure these pages are added to your normal navigation and are interesting for visitors who enter your website either through your homepage or directly through the local landing pages.
Miriam wrote a great guide on Moz for creating those service area pages. You might want to check it out: http://moz.com/blog/local-landing-pages-guide
You can also check out this project we worked on some time ago: www.skaya.nl (Dutch website).
When you click on “Waar rijden wij” (“Where we drive”) you will find a list of the cities where they give driving lessons. Every page has some unique tips and information about driving lessons in those specific cities. In this case we also added the city we’re based in as a level-one page, so both our homepage and the local landing page can score for the city we’re based in.
Two more great resources I would love to share for when you want to start with local SEO:
- http://moz.com/blog/40-important-local-search-questions-answered
- http://moz.com/blog/11-ways-local-businesses-can-get-links
I hope this helps!
PS: When you look at the example, there are two improvements for this website: the tabbed content should be shown immediately, and there should be less duplicated content (like the prices, for example). Just so you don’t copy the wrong parts of this tactic!
-
RE: Local SEO case with two physical locations
Ryan, my final thanks to you for taking the time to respond! I got what I needed to make my decisions.
-
RE: Local SEO case with two physical locations
Thanks a lot Miriam! This definitely helps!
-
RE: Local SEO case with two physical locations
They are day courses, and we are the only ones renting the place.
-
If we wanted to, we could place a huge billboard outside
-
It does have our company name on the door
-
We are there full-time around 3 days a week.
I would definitely say we have the authority to represent this building, since it's just a normal office building we rent and we turned it into a classroom, a place to have lunch and a small space to do some administration.
So yes, they are classes. But we aren't part of a larger facility, and it's our permanent location.
-
-
RE: Local SEO case with two physical locations
Hello Miriam,
Thanks for taking the time to respond. I learned a lot from your previous posts on Moz.
What’s your take on putting up two (or in the future 3 locations) in the footer?
I know it’s a best practice with one location, but I’m not sure what will happen when we put two addresses in the footer (especially when we can only claim one local Google+ page). We really want to communicate those locations to our clients, since it’s really important information for anyone who takes our classes.
Besides that, how would you combine, for example, 3 pages about the physical locations with unique pages for another 10 cities that are in the service area of your business?
Normally I would add those service area pages to the main navigation, but would it make sense to use the same format for service areas as for places with a physical location? With format I mean the combination of information and unique content based on the interests of those local searchers.
Last but not least, would you say a part-time entrepreneur with a physical location that’s only open 3-4 days a week could claim a local Google+ page?
I ask this because I want to know where the borderline is, since our second location really feels like… well, a legit physical location. We are there every week, we are the only business renting this place (we pay for the whole month), we serve our customers there and we communicate the address very frequently (that’s really needed, since a few competitors are located in the same area). So the only reason why it shouldn’t get a local Google+ page is that we are not open the full 5 days a week (besides the phone number, which can easily be fixed and which I already recommended to the client, since the branding/trust benefits alone are enough to switch).
It feels like the Google guidelines are written specifically for classic retail companies. In our (niche) industry there are maybe one or two competitors who are open 5 days a week, since classes are only given when there are enough signups.
I hope you can share your view on this case!
-
RE: Local SEO case with two physical locations
Hmm, I find it hard to make a decision on this point. I fear that treating this as a brand isn’t optimal for the local SEO and will put the company at a disadvantage against competitors that are “based” in that city. Although the guideline does state “staffed during normal business hours”.
Normally I would say that’s the way to go, but in this industry it’s very common to only be staffed when there are courses. And 50% staffed feels the same as an entrepreneur who has a part-time job as well (let’s say a coffee corner that is only open on weekends). In that case I would say having a local page is just fine.
Decisions, decisions…
What is your view on point 1 and 3?
-
RE: Local SEO case with two physical locations
Thanks for your response Ryan. The client rents this place full-time, but it isn't always staffed. There are 2-3 courses at the location every week (these take the whole day).
-
Local SEO case with two physical locations
I hope someone can help me make some decisions. I did read a lot about Local SEO lately but I’m not sure what way to go with this client.
Client:
- Service provider with two physical locations (service is provided on the physical location).
- In the coming 12 months, 1-2 new physical locations will open in other cities.
- Has only one phone number. I will advise them to get a local phone number for both locations, but they prefer one (mobile) number to keep things simple.
- Clients are willing to travel for the service, since it’s a one day course they take. Current clients do come from a lot of different locations.
- The competition for around 5-6 big cities is pretty low since there aren’t a lot of service providers who deliver these courses.
Questions:
- Should I put both addresses in the footer? It’s a best practice with only one location. I think it’s handy for users with two locations as well but I’m worried about how Google sees this. Also this will get confusing when the client passes 3-4 locations.
- If the client sticks with one mobile phone number, should I make a Google + local page for both physical locations? The Google guidelines clearly state they prefer a local number as much as possible.
- If I add “Our service areas” to the top navigation and make a unique place page for every city (to rank organically as well), is it wise to link those local Google+ pages to the unique page about this service? Normally I would go for yes, but I want to put places with and without a physical location under the same navigation.
With just one location I would just focus on that city and add unique pages for the other cities. I’m getting a bit stuck between best practices, since the client has opportunities with multiple strategies.
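On the footer question: one way to keep multiple addresses unambiguous for search engines, however they are displayed on the page, would be to mark each location up with structured data. A hedged sketch using schema.org JSON-LD; the organization name, addresses and phone number here are invented placeholders, not the client's real details:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Courses",
  "location": [
    {
      "@type": "LocalBusiness",
      "name": "Example Courses - Amsterdam",
      "telephone": "+31-6-00000000",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "Examplestraat 1",
        "addressLocality": "Amsterdam",
        "addressCountry": "NL"
      }
    },
    {
      "@type": "LocalBusiness",
      "name": "Example Courses - Rotterdam",
      "telephone": "+31-6-00000000",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "Voorbeeldweg 2",
        "addressLocality": "Rotterdam",
        "addressCountry": "NL"
      }
    }
  ]
}
```

This way each footer address maps to its own LocalBusiness entry, so adding a third or fourth location later is just another item in the array.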
I hope you guys (and girls) can help!
-
RE: Cant find link google are saying is inserted
Hi John,
Besides the great points mentioned before, these suggestions might help you.
- I once saw a few inserted casino/gambling URLs pop up in a Google Analytics report. It might be worth your time to check whether any new URLs have been created.
- Run some security scans on your site to find any vulnerabilities that might have caused the hack.
- Since you didn’t find it in your database, you could check for any hard-coded links with the “Find in Files” option in a text editor like Notepad++. If the link is hard-coded you should be able to find it.
- If you are using a CMS like Wordpress, check if all your plugins and themes are up to date. Poor maintenance could have caused the security breach.
- If your website has been hacked, could it be that the hacker changed links to prevent easy discovery? You might extend your search to more generic spam terms.
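To expand on the hard-coded link check: if you have shell access to the server (or a local copy of the site), grep can search every file at once. A minimal sketch; the directory and the spam domain are made-up examples, not taken from the original question:

```shell
# Build a small demo site folder with one "infected" file (placeholders only)
mkdir -p /tmp/linkcheck-demo
printf '<a href="http://spam-casino.example">win big</a>\n' > /tmp/linkcheck-demo/footer.php
printf '<p>regular page content</p>\n' > /tmp/linkcheck-demo/index.php

# -r recurse into subfolders, -l list matching file names, -i ignore case
grep -rli 'spam-casino.example' /tmp/linkcheck-demo
# prints /tmp/linkcheck-demo/footer.php
```

You can run the same search for generic terms like "casino" or "viagra" if you suspect the hacker used different URLs.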
As Dirk said, don’t wait too long to solve this issue. If you depend on your SEO and you can’t fix this within days, I would rather hire an expert than wait a few weeks.
Good luck!
-
RE: Https vs http two different domains?
Of course!
I think your website isn't ranking on HTTPS at the moment. I couldn't find any HTTPS urls when I used the site:https://ryanyoungdesign.com.au command in Google.
Also, I don't think you have any backlinks pointing at the HTTPS domain. When I check your backlinks, Moz automatically shows the results for the HTTP version (see the notice at the top of Open Site Explorer). When I manually ask for the links of the HTTPS domain, I still get links that point to the normal HTTP domain (I checked a few manually). So I think there isn't much of a problem there. Although I'm curious: where did you get the list that you included? I couldn't replicate it through Open Site Explorer.
Your latest server provider is right that HTTPS isn't set up, and that's exactly where the problem is. HTTPS requests come in on port 443, and if the hosting provider doesn't configure anything for that port, the server falls back to the first site that is configured on port 443. In this case that was another website on the same server that does have HTTPS set up.
Disallowing it in the robots.txt file won't solve the issue of a different website being shown. Your server provider should fix this in the server configuration, either by returning an error page (404) for these HTTPS requests or by serving the same content as the HTTP site over port 443. I would prefer the second option, since it's the most user-friendly.
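For reference, the fix on the provider's side could look roughly like this in nginx. This is a hedged sketch under the assumption that the host runs nginx; the domain names and certificate paths are placeholders, not the actual hosting setup:

```nginx
# Option 1: a catch-all default server on 443, so stray HTTPS requests
# get a 404 instead of falling through to another site's configuration
server {
    listen 443 ssl default_server;
    server_name _;
    ssl_certificate     /etc/ssl/placeholder.crt;  # placeholder cert
    ssl_certificate_key /etc/ssl/placeholder.key;
    return 404;
}

# Option 2 (preferred): serve the site itself over HTTPS with its own
# certificate, pointing at the same document root as the HTTP site
server {
    listen 443 ssl;
    server_name example.com;                        # placeholder domain
    ssl_certificate     /etc/ssl/example.com.crt;
    ssl_certificate_key /etc/ssl/example.com.key;
    root /var/www/example.com;
}
```

Option 2 requires a valid certificate for the domain, which is why many providers only do Option 1 unless you ask for HTTPS explicitly.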
Hope this helps!