Can I block blexhn30.webmeup.com, or does it have anything to do with my Moz Local?
-
I am getting a lot of hits from blexhn30.webmeup.com. My web host says it could be a web service. Is this part of Moz Local activity? If not, I want to block it. Have you seen this before?
-
Thanks Adam!
-
Thank you - just what I needed.
-
Hi Julie,
This post might help. It includes some solid advice and techniques to clean up your GA data: https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter.
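In short, the technique that post walks through is a single "valid hostname" include filter in Google Analytics. A minimal sketch of the filter settings, with yourdomain\.com standing in as a placeholder for whatever hostnames legitimately serve your pages:
Filter type: Custom > Include
Filter field: Hostname
Filter pattern: yourdomain\.com
Ghost spam is injected straight into Google Analytics without ever loading your site, so it shows up with a fake or missing hostname; an include filter on your real hostnames drops all of it at once instead of blacklisting each spam referrer one by one.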
-
My Google Analytics is full of this kind of stuff too. Is there an easy way to get rid of all of it?
-
Hi Stephen,
Webmeup is a crawler unrelated to Moz. Feel free to block them if you'd like, though they don't seem to be nefarious in intent. They've listed info about their crawler and how best to block them here: http://webmeup-crawler.com/.
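If you do decide to block it, robots.txt is the usual first step. A minimal sketch, assuming the crawler honours robots.txt and identifies itself with the user-agent name listed on their crawler page (commonly given as BLEXBot - verify against webmeup-crawler.com before relying on it):
# Block the WebMeUp crawler site-wide (user-agent name assumed from their crawler page)
User-agent: BLEXBot
Disallow: /
Bear in mind robots.txt is only a request; if the bot ignored it, you would need to block it at the server level by user agent or IP instead.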
Related Questions
-
International subdirectory without localized content - best practice / need advice
Hi there, Our site uses a subdirectory for regional and multilingual sites, as shown below, for 200+ countries. EX: /en_US/. All sites have roughly the same content and are in English. We have hreflang tags but still have crawl issues. Is there another URL structure you would recommend? Are there any other ways to avoid the duplicate page and crawl budget issues outside of the hreflang tag? Appreciate it!
Local Website Optimization | erinfalwell
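For illustration, a minimal hreflang sketch for this kind of subdirectory setup, assuming example.com as a placeholder domain and /en_US/ and /en_GB/ as two of the regional folders; every regional URL would need to list all alternates plus itself, with an x-default fallback:
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en_US/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/en_GB/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
If the 200+ versions really are near-identical, hreflang alone won't fix crawl budget; consolidating regions that have no localized content (or canonicalizing them to a single version) is the usual complement to this markup.
-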
Can I have multiple GeoShape Schema for one page on one domain?
Hi Mozers, I'm working on some Schema for a client of mine, but whilst doing the research on GeoShapes with my developer, we came across a potential issue with this particular mark-up. My client is a B2C business, operating in numerous places across the UK. I want to use the Circle property from GeoShape to draw out multiple circles across the UK, but am I able to do this? From looking at some other websites, most seem to just have one GeoShape. Can I have multiple on the same page and same domain? Thanks! Virginia
Local Website Optimization | Virginia-Girtz
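Generally speaking, schema.org lets a property such as areaServed or serviceArea take an array of values, so several circles on one page is structurally valid. A hedged JSON-LD sketch using GeoCircle (a GeoShape subtype), with approximate coordinates for two UK cities and an assumed 30 km radius purely for illustration:
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "areaServed": [
    { "@type": "GeoCircle",
      "geoMidpoint": { "@type": "GeoCoordinates", "latitude": 51.5074, "longitude": -0.1278 },
      "geoRadius": "30000" },
    { "@type": "GeoCircle",
      "geoMidpoint": { "@type": "GeoCoordinates", "latitude": 53.4808, "longitude": -2.2426 },
      "geoRadius": "30000" }
  ]
}
Whether search engines make use of multiple shapes is another matter, but the markup itself is valid as long as each entry in the array is a complete object.
-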
Is there a way to "protect" yourself from non-local traffic?
I'll start with the story, but the main question is at the bottom. Feel free to scroll down :-). I've got good news and bad news regarding a client of mine. It's a service area business that only serves one metropolitan area. We've got a great blog with really valuable content that truly helps people while firmly establishing my client's industry expertise. As a result, local traffic has spiked and the company generates more leads. So that's the good news. The bad (bad-ish?) news is that the client also gets tons of traffic from outside the service area. Not only that, people are calling them all the time who either live in a different state and don't realize that the company isn't local to them, or are located out of state but are calling for free advice. On one hand, the client gets a kick out of it and thinks it's funny. On the other hand, it's annoying and they're having to train all their intake people to ask for callers' locations before they chat with them. Some things we're doing to combat this problem:
1. The title tag on our home page specifies the metro area where we're active.
2. Our blog articles frequently include lines like, "Here in [name of our city], we usually take this approach."
3. There are references to our location all over the site.
4. We've got an actual location page with our address; for that matter, the address is listed in the footer on every page.
5. The listed phone number does not begin with 800; rather, it uses the local area code.
6. All of our local business listings, including our Google My Business listing, are up to date.
7. We recently published a "Cities We Serve" area of the site with highly customized/individualized local landing pages for 12 actual municipalities in our metro region. This will take some time to cook, but hopefully that will help. "Cities We Serve" is not a primary navigation item, but the local landing pages are situated as such: "About Us > Cities We Serve > [individual city page]".
Anyway, here's my main question: in light of all this, is there any other way to somehow shield my client from all this irrelevant traffic and protect them from time-wasting phone calls?
Local Website Optimization | Greenery
-
areaServed JSON-LD schema markup for a local business that targets national tourism
If there is a local business that thrives on ranking nationally for people searching for its services in that location, do you target the business's actual service area or target nationally? For instance, for a hotel in Denver, Colorado, would the areaServed markup be:
"areaServed":[{"@type":"State","name":"Colorado"},{"@type":"City","name":"Denver"}]
or
"areaServed":"USA"
The "geographic area where a service or offered item is provided" would be Denver, Colorado, but we would be looking to target all people nationally who are looking to travel to Denver, Colorado. Or would it be best to target it all, like:
"areaServed":[{"@type":"State","name":"Colorado"},{"@type":"City","name":"Denver"},"USA"]
Local Website Optimization | SEOdub
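For what it's worth, areaServed accepts an array of mixed values, so the options above aren't mutually exclusive. A hedged sketch combining them, using a Country object rather than a bare "USA" string for consistency:
"areaServed": [
  { "@type": "City", "name": "Denver" },
  { "@type": "State", "name": "Colorado" },
  { "@type": "Country", "name": "US" }
]
Going by the definition quoted in the question, areaServed describes where the service is actually provided (the Denver hotel), not where guests travel from, so the narrower City/State values are arguably the more accurate core markup; reaching a national audience is more a content and ranking question than a markup one.
-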
If we are a locally based business, what is the best approach to tracking keywords? Should we be micro-tracking?
We are a locally based business with only one physical property, but we service a 15-mile radius (people within a 15-mile radius will use our services). When it comes to keyword tracking and monitoring, should we just be looking at the 3 main local towns, or should we go out to the villages around our area too? At what level should we be micro-tracking? Do we go to such a micro level that we track keywords for all the villages, which creates a lot of keywords for the locations? What is the best approach?
Local Website Optimization | Mutatio_Digital
-
All metrics appear to be better than our local competitors', yet our ranking doesn't reflect it. Help?
Hi, I work for a marquee company and have recently been really trying to optimise our SEO through good content, link building, social media (especially Google+) and so on. Yet a rival (www.camelotmarquees.com) who performs worse than us on the majority of the Moz parameters still ranks better than us in both organic search and Google Places. The clear and obvious factor they beat us on is internal links, currently over 15,000, which seems ridiculous for the size of their site compared to our roughly 120. Would this have that much of an effect on the rankings, and how on earth have they got so many? Also, are there any tips or advice to help us leapfrog them? We feel we're producing regular, useful content and have optimised our site the best we can. website: www.oakleafmarquees.co.uk keywords: marquee hire dorset, marquee dorset, dorset marquee hire, wedding marquee hire
Local Website Optimization | crazymoose78
-
Can too many 301 redirects damage my ecommerce site? - SEO issue
Hello All, I have an eCommerce website doing online hire. We operate from a large number of locations (approx. 100), and my 100 or so categories have individual location pages against them, for example:
Carpet Cleaners (category) - www.mysite/hire-carpetcleaners
carpet cleaner hire Manchester - www.mysite/hire-carpetcleaners/Manchester
carpet cleaner hire London
carpet cleaner hire Liverpool
Patio Heater (category)
patio heater hire Manchester
patio heater hire London
patio heater hire Liverpool
And so on... I have unique content for some of these pages, but given that my site has 40,000-odd URLs, I do have a large amount of thin/duplicate content, and it's financially not possible to get unique content written for every single page for all my locations and categories. Historically I used to rank very well for these location pages, although this year things have dropped off, and recently I was hit by the Panda 4.0 update, which I understand targets thin content. Therefore, what I am in the process of doing is reducing the number of locations I want to rank for and have pages for, thus allowing me to achieve a higher percentage of unique content over duplicate/thin content on the whole site, and to concentrate only on a handful of locations for which I can realistically get unique content written. My questions are as follows. By reducing the number of locations, my website will 301 redirect each location page I drop back to its parent category - e.g. the carpet cleaner hire Liverpool page will redirect back to the parent carpet cleaner hire page. Given that I have nearly 100 categories to do, this means the site will generate thousands of 301 redirects when I reduce down to a handful of locations per category. The alternative is that I can 404 those pages. What do you think I should do? Will it harm me to have so many 301s? It's essentially the same page with a location name in it redirecting back to the parent; some of these do have unique content, but most don't. My other question is: for some of these categories with location pages, I currently rank very well locally, although there is no real traffic for these location-based keywords (using Keyword Planner). Shall I bin them or keep them? Lastly, once I have reduced the number of location pages, I will still have thin content until I can get the unique content written for them. Should I remove these pages until that point or leave them as they are? It will take a few months to get unique content across the whole site. Once complete, I should be able to reduce my site down from 40,000-odd pages to, say, 5,000 pages. Any advice would be greatly appreciated, thanks.
Pete
Local Website Optimization | PeteC12
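As a rough illustration of the redirect mechanics, a pattern-based rule can collapse every dropped location page under a category back to its parent, so you maintain one rule per category rather than thousands of individual redirects. A minimal Apache .htaccess sketch, assuming the /hire-carpetcleaners/{location} structure from the question, that mod_rewrite is enabled, and that any location pages you are keeping are handled before this rule:
# 301 any location page under the category back to the parent category page
# e.g. /hire-carpetcleaners/Liverpool -> /hire-carpetcleaners/
RewriteEngine On
RewriteRule ^hire-carpetcleaners/.+$ /hire-carpetcleaners/ [R=301,L]
The count of 301s matters far less than making sure each one resolves in a single hop to a genuinely relevant parent page; redirect chains and redirects to unrelated pages are the patterns that tend to cause problems.
-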
Need to access robots.txt to block tags
Hi, My website nykb.com is showing up in Moz as having multiple duplicate pages because of the tags (each tag generates its own page, and since posts have many tags but the same tags are only used once or twice, the tag pages are all duplicate pages). I wanted to block the tag pages in robots.txt but can't seem to find access to it - I have searched online but haven't come up with anything! I do not have access to the FTP folders, only the WordPress backend. Should I just remove tags? The posts are grouped by category too. THANKS
Local Website Optimization | henya
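For illustration, a robots.txt sketch for this situation, assuming the tag archives live under WordPress's default /tag/ path (check the actual tag URLs first). If FTP isn't available, SEO plugins such as Yoast typically let you edit robots.txt from the WordPress backend:
# Keep crawlers out of the tag archive pages
User-agent: *
Disallow: /tag/
Note that robots.txt only stops crawling; tag pages that are already indexed may linger, so setting tag archives to noindex (an option in most WordPress SEO plugins) is often the cleaner long-term fix - and the noindex only gets seen if crawling isn't blocked.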