Competitors have local "mirror" sites
-
I have noticed that some of my competitors have set up "mirror" homepages for different counties, towns, or suburbs. In one case the mirror homepages are virtually identical except for the title, and in the other case about half of the content is duplicate and the other half is different. Both of these competitors have excellent rankings and traffic. I am surprised by these results. Does anyone care to comment on it, and is this a grey hat technique that is likely to be penalized eventually?
thx
Diogenes
-
They both have massive numbers of backlinks and domains linking in. What do you think?
Looking at the NY site, it has a total of 2,300 links from 46 domains. That is not a lot at all. A single sitewide footer link can generate 10k+ links from one site, so the focus should be on the number of linking domains, which is 46 in this case.
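To illustrate the point above with a quick sketch (hypothetical URLs, not this site's actual backlink data), here is why a raw link count can collapse to a much smaller number of linking root domains once you group backlinks by domain, the way OSE's "group by domain" filter does:

```python
from urllib.parse import urlparse
from collections import Counter

def linking_domains(backlink_urls):
    """Count backlinks per root domain (naive heuristic: last two host labels)."""
    counts = Counter()
    for url in backlink_urls:
        host = urlparse(url).netloc.lower()
        root = ".".join(host.split(".")[-2:])  # crude root-domain extraction
        counts[root] += 1
    return counts

# A sitewide footer link inflates the raw link count without adding domains:
links = ["https://blog-a.example.com/page%d" % i for i in range(1000)]
links += ["https://legal-directory.example.org/listing"]

counts = linking_domains(links)
print(len(counts))  # 2 linking domains despite 1,001 raw links
```

The heuristic for finding the root domain is deliberately simplified (it mishandles TLDs like .co.uk), but it shows why 2,300 links from 46 domains is a modest profile.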
When I review the links for this site in OSE, they seem quite natural. There are a couple of legal directories but nothing unexpected. The anchor text varies nicely and the link profile is what I would expect to find. In OSE I usually apply the following filters: followed+301, only external, on this root domain, group by domain. Take a look for yourself and I think you will agree there doesn't seem to be anything unusual.
This NY site is not setting up "mirror" sites, but landing pages. Mirror sites are basically identical sites set up under different domains. Landing pages are pages within a site designed to welcome visitors who locate your site through specific search terms or marketing ads.
If this was my SEO client, I would advise them to increase the amount of unique content on their landing pages. I am not comfortable at all with these pages in their current form. With that said, Google apparently is comfortable with them and is indexing these pages.
The site's DA is 31, and PA is around 30 for these pages. These sites are very easily beatable with proper SEO work. If I were in your position I would be pleased to compete with them. No matter what site you build, you are going to have competition in the SERPs.
What results are you concerned about? I presume you are searching for the exact phrases in their domain names? These types of sites usually do well in exact-match domain searches, but otherwise they don't fare well. If you offer a basic site with good content and solid SEO, you will beat these types of sites every day.
-
Sorry I didn't make myself clear. These sites are not related, but both use the same strategy. Look at the footer links. Incidentally, I came across another thread in this forum where another lawyer (coincidentally) was complaining that the more he optimized, the lower his rankings. The answer was that he was creating a lot of duplicate content by trying to set up separate pages for each town and village. Maybe the two sites succeed not because of the strategy but despite it. They both have massive numbers of backlinks and domains linking in. What do you think?
Paul
-
Thanks for the URLs Paul.
The two examples you offered are two distinct websites. I looked at the IPs of both sites and they are completely different. The WhoIs registration information is completely different. The web design of each site is completely different. The NY site is a WordPress site, while the other seems to be based on a basic site template. Even without reviewing the content, these appear to be two completely separate websites.
When I look at the content provided on these pages, it is not duplicated at all. One site has a video, the other does not. They even have different addresses and phone numbers. Even under scrutiny, these sites appear completely unrelated. If these two sites represent the same company, they certainly did a good enough job of differentiating them to earn both a place in the search indexes.
-
Hi Paul.
You mentioned that half of the content was duplicate, while half was unique. These pages may be unique enough to be considered original pages. There are many factors involved, and it is not possible to say much more in a generic Q&A without reviewing the sites.
To answer your question to Brian, a dynamic web page in this context is one where the content changes based on visitor information. For example, if you are connecting from Dallas, Texas, the page would display the weather, local news, etc. for that area. If you then connected to the same site from Miami, Florida, the weather and local news would be given for Miami. That is one example of a dynamic web page.
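A minimal sketch of the idea (the location data and wording here are invented for illustration, not taken from any real site): a single dynamic page selects localized content at request time, instead of a separate pre-built page existing for each town.

```python
# Hypothetical localized content, keyed by the visitor's detected location.
LOCAL_CONTENT = {
    "Dallas, TX": {"weather": "Sunny, 85F", "news": "Dallas headline"},
    "Miami, FL": {"weather": "Humid, 90F", "news": "Miami headline"},
}

def render_landing_page(visitor_location):
    """Build one page whose content varies with the visitor's location."""
    data = LOCAL_CONTENT.get(visitor_location)
    if data is None:
        # Unknown location: fall back to a generic national page.
        return "Welcome! (generic page)"
    return "Welcome, %s visitor! Weather: %s | News: %s" % (
        visitor_location, data["weather"], data["news"])

print(render_landing_page("Dallas, TX"))
```

In practice the location would come from IP geolocation on the server, but the principle is the same: one URL, many localized renderings.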
In short, it is entirely possible to offer localized landing pages for specific areas in a white hat manner. It is of course also possible to do so using black hat techniques. Based on the 50% variance in content and the fact that their pages are performing well, it sounds like they may be doing things in an acceptable manner. We would need the URLs for the main and local sites to offer further insight.
-
Well, unfortunately I do not know what a dynamic landing page is, or how to tell. So, could you enlighten me and then we would be closer to the answer?
thx
diogenes
-
Are any of these dynamic landing pages?