Site Verification Issues
-
We have a group of automotive dealerships served by a website provider that causes issues when we try to verify our sites.
Because the provider uses Analytics for its own data program, it injects its own code into our websites, stopping us from doing so properly in our back end. We also cannot verify ourselves in Webmaster Tools or AdWords. We can't actually "own" any of our sites, since the provider runs a jQuery script from within the website. They also do not allow the use of iframes or custom scripts, so we can't even use a tag container to verify these sites.
Any help or insight would be greatly appreciated, as I am sure there is some way around this so we can get our data and be verified.
-
Yes, it is awful. Hopefully we can switch one day; it's not my decision. But thanks, I agree wholeheartedly!
-
This isn't going to be very helpful for your immediate issue, but you need to move the site away from the current website provider. Having a successful website is going to require the ability to have total control over it when you want it. It doesn't have to be expensive, but you need to have the ability to do the things you want to do.
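In the meantime, if the provider allows any changes at the domain registrar level, Search Console also supports verification through a DNS TXT record, which sidesteps the on-page script restrictions entirely. A hypothetical zone-file entry (the domain and token below are placeholders; Google generates the real token in the verification dialog):

```
; Hypothetical DNS zone entry for Search Console verification.
; "abc123placeholder" stands in for the token Google issues per property.
dealership-example.com.  3600  IN  TXT  "google-site-verification=abc123placeholder"
```

This only works if you control the domain's DNS records; if the website provider also controls the domain registration, that is one more strong argument for moving.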
Related Questions
-
Schema markup for a local directory listing and Web Site name
Howdy there! Two schema-related questions here.

1. Schema markup for a local directory. We have a page that lists information for multiple locations on a single page, as a directory-type listing. Each listing links to another page that contains more in-depth information about that location. We have seen markups using the schema.org LocalBusiness type for each location listed on the directory page. Examples:

http://www.yellowpages.com/metairie-la/gold-buyers
http://yellowpages.superpages.com/listings.jsp?CS=L&MCBP=true&C=plumber%2C+dallas+tx

Both of these validate using the Google testing tool, but what is strange is that the yellowpages.com example puts the URL of the profile page for a given location as the "name" in the LocalBusiness schema, while superpages.com uses the actual name of the location. Other sites such as Yelp have no markup at all for a location on a directory-type page. We want to stay with schema and are leaning towards the superpages option. Any opinions on the best route to go with this?

2. Schema markup for logo and social profiles vs. website name. Google's article on schema markup for your logo and social profiles recommends/shows using the @type Organization: https://developers.google.com/structured-data/customize/social-profiles. If you then click down the left column on that page to "Show your name in search results", it recommends/shows using the @type WebSite: https://developers.google.com/structured-data/site-name. We want to have the markup for the logo, social profiles, and website name. Do we just need to repeat the schema for the WebSite name in addition to what we have for Organization (two sets of markup)? Our concern is that both reference the same home page, and in one case we are saying we are an organization and in the other a website. Does this matter? Will Google be OK with the logo and social profile markup if we use the WebSite designation? Thanks!
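On the second question, one commonly seen pattern (a hedged sketch with placeholder names and URLs, not an official Google recommendation) is to publish both types on the home page as separate objects in a single JSON-LD @graph, so the Organization and WebSite markup sit side by side rather than conflicting:

```html
<!-- Hypothetical sketch: Organization (logo, social profiles) and
     WebSite (site name) as two objects in one JSON-LD graph.
     All names and URLs below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "name": "Example Co",
      "url": "https://www.example.com/",
      "logo": "https://www.example.com/logo.png",
      "sameAs": [
        "https://www.facebook.com/exampleco",
        "https://twitter.com/exampleco"
      ]
    },
    {
      "@type": "WebSite",
      "name": "Example Co",
      "url": "https://www.example.com/"
    }
  ]
}
</script>
```

Both objects describe the same URL without contradiction: one says who runs the site, the other names the site itself.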
Local Website Optimization | HeaHea0 -
How to create sites with powerful individual pages to achieve top results.
According to Moz, I need to have powerful individual pages to achieve top results. My site has 0 authority, so for this reason I need to focus on powerful pages, but how do I know whether my pages are powerful or not?
Local Website Optimization | A.V.S0 -
Benefits of adding keywords to site structure?
Hello fellow Mozzers, This is kind of a hypothetical, but it might have implications for future projects. Do you think there would be any benefits (or drawbacks) to placing pages of a site into a directory named after a keyword? For example, if I had a local store that sold hockey equipment, and "hockey", "equipment", and "hockey equipment" were the main targets being optimized for, would it be better (assuming the actual pages were the same) to structure the site as

hypotheticalwebsite.com/about-us/
hypotheticalwebsite.com/hockey-skates/
hypotheticalwebsite.com/hockey-sticks/
hypotheticalwebsite.com/blog/

or

hypotheticalwebsite.com/hockey-equipment/about-us/
hypotheticalwebsite.com/hockey-equipment/hockey-skates/
hypotheticalwebsite.com/hockey-equipment/hockey-sticks/
hypotheticalwebsite.com/hockey-equipment/blog/

Additionally, would any of this change if the root domain or the individual pages ALSO used those keywords (or if both of them did)?

pseudonyms-hockey-gear.com/hockey-equipment/skates/
pseudonyms-penalty-box.com/hockey-equipment/hockey-skates/
pseudonyms-hockey-gear.com/hockey-equipment/hockey-skates/

I've got a hunch that some of these are overkill, but I'm not sure where the scale tips from helpful to negligible to actively counterproductive. Thanks, everyone!
Local Website Optimization | BrianAlpert780 -
What is the best CMS Approach for Multilingual Versions of Site?
We have expanded into France and Brazil and now have someone in-house who can translate into French and Brazilian Portuguese. I own the ".fr" and ".com.br" versions of our domain. We are using WordPress for our CMS. We currently publish about two articles a week on the English site, which we would translate and publish through the new international sites (when appropriate). We will also change out photos and videos at times, in addition to all the text/copy. So, before I jump deep into this, I wanted to reach out for help regarding the best modern approach. Should I use some sort of WP plugin that will let me manage each of these through one WP install, or is it better to run each separately through multiple WP installs? I want to achieve this while avoiding any duplicate content penalties and providing easy admin/editor management of publishing content. Any help/advice is greatly appreciated. Thanks!
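Whichever install structure you choose, the duplicate-content concern across the three domains is usually handled with hreflang annotations rather than by making each version artificially different. A minimal sketch, assuming each translated article lives at a parallel path on each ccTLD (the domains and path below are placeholders):

```html
<!-- Hypothetical hreflang set for one article, placed in the <head>
     of all three language versions. Domains and paths are placeholders. -->
<link rel="alternate" hreflang="en" href="https://www.example.com/article/" />
<link rel="alternate" hreflang="fr" href="https://www.example.fr/article/" />
<link rel="alternate" hreflang="pt-br" href="https://www.example.com.br/article/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/article/" />
```

Several WordPress multilingual plugins can emit tags like these automatically, whether you run one multisite install or separate installs, so the CMS decision can be made mostly on management convenience.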
Local Website Optimization | the-coopersmith0 -
Can too many 301 redirects damage my ecommerce site? SEO issue
Hello All, I have an eCommerce website doing online hire. We operate from a large number of locations (approximately 100), and my 100 or so categories each have individual location pages against them. Example:

Carpet Cleaners (category): www.mysite/hire-carpetcleaners
carpet cleaner hire Manchester: www.mysite/hire-carpetcleaners/Manchester
carpet cleaner hire London
carpet cleaner hire Liverpool

Patio Heater (category)
patio heater hire Manchester
patio heater hire London
patio heater hire Liverpool

And so on. I have unique content for some of these pages, but given that my site has 40,000-odd URLs, I have a large amount of thin/duplicate content, and it's financially not possible to get unique content written for every single page for all my locations and categories. Historically, I used to rank very well for these location pages, although this year things have dropped off, and recently I was hit with the Panda 4.0 update, which I understand targets thin content. Therefore, what I am in the process of doing is reducing the number of locations I want to rank for and have pages for, which lets me achieve a higher percentage of unique content over duplicate/thin content across the whole site and concentrate on a handful of locations I can realistically get unique content written for. My questions are as follows:

1. By reducing the number of locations, my website will 301 redirect each dropped location page back to its parent category, e.g. the carpet cleaner hire Liverpool page will redirect back to the parent carpet cleaner hire page. Given that I have nearly 100 categories to do, the site will generate thousands of 301 redirects when I reduce down to a handful of locations per category. The alternative is that I can 404 those pages. What do you think I should do? Will it harm me to have so many 301s? It's essentially the same page with a location name in it redirecting back to the parent. Some of these have unique content, but most don't.

2. On some of these categories with location pages, I currently rank very well locally, although there is no real traffic for these location-based keywords (according to Keyword Planner). Shall I bin them or keep them?

3. Once I have reduced the number of location pages, I will still have thin content until I can get the unique content written. Should I remove these pages until that point or leave them as they are? It will take a few months to get the whole site onto unique content. Once complete, I should be able to reduce my site from 40,000-odd pages to, say, 5,000 pages.

Any advice would be greatly appreciated. Thanks, Pete
Local Website Optimization | PeteC12 -
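If the 301 route is chosen, the redirects do not have to be thousands of individual rules: one pattern per category can fold every retired location page back into its parent. A hypothetical .htaccess sketch (the paths mirror the examples in the question and would need testing against the real URL scheme; any location pages being kept would need RewriteCond exceptions):

```apache
# Hypothetical sketch: 301 any retired location page back to its
# parent category with one rule per category. Paths are placeholders.
RewriteEngine On
RewriteRule ^hire-carpetcleaners/[^/]+/?$ /hire-carpetcleaners/ [R=301,L]
RewriteRule ^hire-patioheaters/[^/]+/?$ /hire-patioheaters/ [R=301,L]
```

Pattern rules like these keep the server config to roughly one line per category instead of one line per retired page.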
International site, be visible on both .com and .co.uk?
Do you guys have any tips for increasing visibility in both Google.com and Google.co.uk? The site today has good visibility in the USA, but it's poor in the UK. Information: the server is based in the US, no region is set in Google Webmaster Tools, and incoming links are from global regions, mostly the US. Do we need to add a specific section for the UK (uk.site.com or site.com/uk/) and specify the region in GWT to make sure Google handles this the right way? It's a lot of work to rewrite all the content for another section, which is also in English...
Local Website Optimization | Vivamedia0
HELP, my site gets more than 40k visits per day and the server is down; I do not want all these visits...
Hello... I have a website for a local spa in Ecuador. The website has a blog with some tips about health, and suddenly one of the articles went viral on South American Facebook profiles, so I am receiving 40k visits per day from other countries that are of no interest to me, because my site is for a local business in Ecuador. I have already blocked some countries by IP, but I am still receiving visits from other South American countries. For this reason my hosting company took my website down, and I cannot put it back online, because these thousands of visits use more than 25% of the server's CPU and the hosting company takes the site down again. I really need to know what to do. I do not want to pay for an expensive dedicated server, because all these visits from other countries are not of interest to me, and as I said before, my business is local.
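One low-cost pattern worth checking (a hedged sketch, assuming the site can be put behind a proxy such as Cloudflare that adds a CF-IPCountry request header): block by country code at the web server instead of maintaining raw IP ranges, so unwanted requests are rejected before they reach any CPU-heavy application code. The country codes below are examples only, not a recommendation of which countries to block:

```apache
# Hypothetical Apache 2.4 sketch: reject requests whose CF-IPCountry
# header (added by an upstream proxy) matches an example blocklist.
SetEnvIf CF-IPCountry "^(CO|PE|AR)$" blocked_country
<RequireAll>
    Require all granted
    Require not env blocked_country
</RequireAll>
```

A full-page cache in front of the viral article can achieve a similar CPU reduction without blocking anyone, which may be the friendlier first step.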
Local Website Optimization | lans27872
International Site Geolocation Redirection (best way to redirect and allow Google bots to index sites)
I have a client with an international website. The website currently has IP detection and redirects you to the subdomain for your country. They have only launched the Australian website so far and are not yet open to the rest of the world: https://au.domain.com/. Google is not indexing the Australian website or its pages; instead, I believe the bots are being blocked by the IP redirection every time they try to visit one of the Australian pages, so only the US "coming soon" page is being properly indexed. So, I would like to know the best way to implement a geolocation redirect without creating a splash page for selecting a location. User-friendliness is most important (so we don't want cookies, etc.). I have seen the great Whiteboard Friday video on Where to Host and How to Target, which makes sense, but it doesn't tell me the exact best method for redirection, except at about 10:20 where it tells me what I'm doing is incorrect. I have also read a number of other posts on IP redirection, but none tell me the best method, and some use slightly different examples. I need US visitors to see the US coming soon page and Google to index the Australian website. I have seen a lot about JS redirects, IP redirects and .htaccess redirects, but unfortunately my technical knowledge of how these affect Google's bots doesn't really help. Appreciate your answers. Cheers, Lincoln
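One compromise seen in practice (a hedged sketch, not an official recommendation; note that Google generally advises treating Googlebot like any other visitor from its IP location, so hreflang annotations plus a non-forced country suggestion banner is the safer long-term pattern) is to skip the redirect for known crawlers and for users who have already made a locale choice:

```javascript
// Hypothetical sketch: decide whether a request should be geo-redirected.
// Crawlers and returning users who already picked a locale are never
// redirected, so every country version of the site stays crawlable.
const BOT_PATTERN = /googlebot|bingbot|duckduckbot|baiduspider/i;

function geoRedirectTarget({ userAgent, country, localeCookie }) {
  if (BOT_PATTERN.test(userAgent || "")) return null; // let bots index everything
  if (localeCookie) return null;                      // respect the saved choice
  if (country === "AU") return "https://au.domain.com/";
  return null;                                        // default: US coming-soon page
}

console.log(geoRedirectTarget({ userAgent: "Googlebot/2.1", country: "AU" })); // null
console.log(geoRedirectTarget({ userAgent: "Mozilla/5.0", country: "AU" }));   // "https://au.domain.com/"
```

The same decision function can back a server-side 302 or a client-side redirect; the key property is that no crawler ever hits the redirect branch.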
Local Website Optimization | LincolnSmith0