Perfect, just what I needed. I just hope these "old school" ping efforts still work. The client won't let us access their Google Search Console, and yet we need their website crawled ASAP.
Thanks!
Thank you very much. Do you know where I can find more information about the HTTP ping? The Google articles don't really provide step-by-step information on how to do this.
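From what I can piece together, the "ping" seems to boil down to a single GET request against Google's sitemap ping endpoint, something like the sketch below. The sitemap URL is a placeholder, and I am not certain this is the whole story or that Google still acts on these pings, which is partly what I am asking:

```python
# Hypothetical sketch: ping Google with a sitemap URL, no Search Console needed.
# The sitemap URL is a placeholder; whether Google still honors these pings is
# exactly the open question in this thread.
import urllib.parse
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder client sitemap

ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")

with urllib.request.urlopen(ping_url, timeout=10) as response:
    # A 200 response only means the ping was received, not that the sitemap was crawled.
    print(response.status, response.reason)
```

Listing the sitemap location in the site's robots.txt (a `Sitemap:` line) also seems to be a way to expose it to crawlers without GSC access.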
We have a client that will not grant us access to their Google Search Console (don't ask us why).
Is there any way to submit an XML sitemap to Google without using GSC?
Thanks
Excellent answer. Thank you very much.
Yesterday a client discovered that our staging URLs were being indexed in Google. This was due to a technical oversight by our development team (they forgot to add meta robots tags).
We are trying to remove this content as quickly as possible. Are there any methods in the Google Search Console to expedite this process?
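As a sanity check on our side, we have been verifying that the staging URLs now actually send a noindex signal before requesting anything in Search Console. A rough sketch of that check, where the staging hostname and URL list are hypothetical placeholders:

```python
# Rough sketch: confirm staging URLs now send a noindex signal (header or meta tag)
# before requesting removal. staging.example.com and the URLs are placeholders.
import requests  # assumes the requests library is installed

STAGING_URLS = [
    "https://staging.example.com/",
    "https://staging.example.com/products/",
]

for url in STAGING_URLS:
    resp = requests.get(url, timeout=10)
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    # Crude substring test; a real check would parse the HTML properly.
    meta_noindex = 'name="robots"' in resp.text.lower() and "noindex" in resp.text.lower()
    print(url, "| header noindex:", header_noindex, "| meta noindex:", meta_noindex)
```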
Thanks
I heard a rumor that AdWords Express offers a tool that lets you check real-time Google ranking results (a colleague brought this up).
Has anybody heard of this?
We have an enterprise (well-known brand) client who is asking about the Google guidelines on embedding reviews from third-party website(s). Essentially the client wants a "summary" of reviews on their landing pages.
We are well aware that Google's best practices do not permit structured data for curated reviews. However, are there any guidelines saying that a review summary in general violates webmaster best practices?
Thanks
In the past I have seen most clients create a new Google Search Console profile when they migrate to an https URL. However, a colleague of mine asked if just updating the Change of Address option will suffice: https://support.google.com/webmasters/answer/83106.
Would it be best to just update the Change of Address setting for the Google Search Console profile to keep the data seamless?
Thanks
Hey Paul
Did you get any response after tweeting Google? Thx.
Paul
That was an excellent response. I also appreciate you going out of your way to hit up Google directly about this as well.
Yes, we believe it is completely unnecessary to commit valuable resources to resolving a very minor issue. However, our client is going to ask us to back up our argument.
Thanks again
Thirteen months ago we removed a large number of non-corporate URLs from our web server. We created 301 redirects and in some cases, we simply removed the content as there was no place to redirect to.
Unfortunately, all these pages still appear in Google's SERPs (though not Bing's), both the 301'd pages and the pages we removed without redirecting. When you click the redirected pages in the SERPs you do get redirected, so we have ruled out any problems with the 301s.
We have already resubmitted our XML sitemap and when we run a crawl using Screaming Frog we do not see any of these old pages being linked to at our domain.
We have a few different approaches we're considering to get Google to remove these pages from the SERPs and would welcome your input.
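For reference, the kind of spot check we ran to rule out 301 problems looks roughly like the sketch below (the URLs are placeholders). Our understanding is that the pages we removed without a redirect would ideally return 410, or at least 404, so Google has an unambiguous signal to drop them:

```python
# Sketch: verify old URLs return 301 (with a target) or 404/410 as intended.
# The URL list is a hypothetical placeholder for the removed/redirected pages.
import requests  # assumes the requests library is installed

OLD_URLS = [
    "https://www.example.com/old-page-1",
    "https://www.example.com/removed-section/old-page-2",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 308):
        print(url, "->", resp.headers.get("Location"))
    else:
        # 410 Gone is a stronger "drop this" signal than a plain 404.
        print(url, "status:", resp.status_code)
```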
Thank you.
Rosemary
One year ago I removed a whole lot of junk that was on my web server but it is still appearing in the SERPs.
Recently one of my clients was hesitant to move their new store locator pages to a subdomain. They have some SEO knowledge and cited the Whiteboard Friday article at https://moz.com/blog/subdomains-vs-subfolders-rel-canonical-vs-301-how-to-structure-links-optimally-for-seo-whiteboard-friday.
While it is very possible that Rand Fishkin has a valid point, I was hesitant to let this be the final verdict. John Mueller from Google Webmaster Central claims that Google is indifferent towards subdomains vs. subfolders.
https://www.youtube.com/watch?v=9h1t5fs5VcI#t=50
Also, this SEO disagreed with Rand Fishkin's post about using subfolders instead of subdomains. He claims that Rand Fishkin ran only 3 experiments over 2 years, while he has tested multiple subdomain vs. subfolder experiments over 10 years and observed no difference.
http://www.seo-theory.com/2015/02/06/subdomains-vs-subfolders-what-are-the-facts-on-rankings/
Here is another post from Website Magazine. They too believe that there are no SEO benefits to a subdomain vs. subfolder infrastructure; proper SEO and infrastructure are what matter most.
Again, Rand might be right, but I would rather provide a recommendation to my client based on an authoritative source, such as a Google engineer like John Mueller.
Does anybody else have any thoughts and/or insight about this?
My client manages 15+ websites, and we have observed that the spam referrer issue has gotten steadily worse over the last few months. I took the time to weed out the worst offenders via traffic filters, but our referral traffic garden was quickly full of weeds again a few weeks later.
Does anybody know what the Google Analytics development team is doing to combat the spam referrer issue? I find it quite amazing that Google goes through the asinine effort to block out organic keyword data ("not set" & "not provided") but they don't seem to care about blacklisting/filtering spam referrers.
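In the meantime we keep rebuilding our own exclusion filters by hand. Before pasting an expression into a GA referral filter we sanity-check it roughly like this; the domains listed are just an illustrative sample, not a vetted blacklist:

```python
# Rough sketch: test a referral-spam regex before using it in a GA filter.
# The spam domains are illustrative placeholders, not a vetted blacklist.
import re

SPAM_PATTERN = re.compile(
    r"(^|\.)(semalt\.com|buttons-for-website\.com|best-seo-offer\.com)$",
    re.IGNORECASE,
)

test_referrers = [
    "semalt.com",
    "www.buttons-for-website.com",
    "news.google.com",       # should NOT match
    "partner-site.example",  # should NOT match
]

for host in test_referrers:
    print(host, "-> filtered" if SPAM_PATTERN.search(host) else "-> kept")
```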
I have just started using Link Detox to determine if our clients have links/domains pointing to their sites that could be harming them in organic search. In a few cases 7%-9% of links have been flagged as a high priority to be disavowed.
I would be interested in your opinion on the following:
If your site does not have a Google penalty, is there an advantage to disavowing pages that have been flagged as high risk? When I go to those pages they look spammy and have no real value other than an inbound link.
If a client acquires another website/company and that website is now 301 redirected to the client's site, would "high risk" inbound links from the acquired company cause a problem for my client?
A client has taken down content from their site that was completely unrelated to their current business. Is there a benefit in disavowing those links to the old content that are deemed by Link Detox as being high risk?
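If we do end up disavowing in any of these scenarios, the plan would be to turn the flagged domains into a standard disavow file for Search Console, roughly like the sketch below. The domain names are placeholders; in practice they would come from the Link Detox export:

```python
# Sketch: build a disavow.txt from a list of flagged domains.
# The flagged domains are placeholders; Google's disavow file format accepts
# comment lines starting with "#" and one "domain:" entry (or URL) per line.
flagged_domains = [
    "spammy-directory.example",
    "paid-links.example",
]

lines = ["# Disavow file generated from Link Detox high-risk domains"]
lines += [f"domain:{d}" for d in sorted(set(flagged_domains))]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")

print("\n".join(lines))
```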
Thank you,
Rosemary
Our client has a good number of results showing up in the SERPs that are search-results pages produced by blog posts. Unfortunately all these results have exactly the same title tag, and it has nothing to do with the blog content, which means they are unlikely to help us much.
We can't create a 301 redirect because there is no page to redirect to. There is no blog page we can rel=canonical to either.
The content on these pages is a short list of blog posts by each author. They are not true "Author" pages that would have a URL structure like this: yourcompany.com/author/joeblow
Our plan is to use GWMT's URL removal tool to request removal of these pages (and then try to stop new results from being created).
We are doing this to get low-value content out of the SERPs. Is there a better way to remove these search results? Any drawback to removing them in GWMT?
Thanks.
We are trying to get more detailed data about where direct website phone calls are coming from (not just AdWords call extensions). We know there are a few services available that do this, but are there any recommendations for a reliable one? We would like to show our clients Google Analytics conversion metrics that show incoming phone calls from different campaigns.
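To give a sense of what we are after: our understanding is that most call-tracking services push each answered call into GA as an event, which can then be used as a goal. A bare-bones sketch of that kind of hit via the Universal Analytics Measurement Protocol, where the property ID, client ID, and campaign values are all hypothetical placeholders:

```python
# Sketch: record an incoming phone call as a GA event via the Measurement Protocol.
# Property ID, client ID, and campaign values are hypothetical placeholders;
# a call-tracking vendor would send something similar for each tracked call.
import requests  # assumes the requests library is installed

payload = {
    "v": "1",                      # Measurement Protocol version
    "tid": "UA-XXXXXXX-1",         # placeholder GA property ID
    "cid": "555",                  # placeholder client ID
    "t": "event",
    "ec": "phone",                 # event category
    "ea": "call",                  # event action
    "el": "tracking-number-1234",  # event label: which tracking number rang
    "cn": "spring-promo",          # campaign name the number was assigned to
    "cs": "google",                # campaign source
    "cm": "cpc",                   # campaign medium
}

resp = requests.post("https://www.google-analytics.com/collect", data=payload, timeout=10)
print(resp.status_code)  # GA returns 200 even for malformed hits, so 200 != validated
```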
We have a client that migrated to https last September. The site uses canonicals pointing to the https version.
The client's IT team is reluctant to put 301 redirects in place from the non-secure to the secure version, and we are not sure why they object.
We ran a Screaming Frog report and it is showing both URLs for the same page (http and https). The non-secure version has a canonical pointing to the secure version. For every secure page there is a non-secure version in Screaming Frog, so Google must be ignoring the canonical and still indexing the page; however, when we run a site: search we see that most URLs are the secure version.
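For reference, the spot check we use on a sample of pages to see whether the http version actually redirects or only declares a canonical looks roughly like this (the domain and paths are placeholders):

```python
# Sketch: for a sample of pages, check whether the http version 301s to https
# or only declares a canonical. Domain and paths are hypothetical placeholders.
import requests  # assumes the requests library is installed

PATHS = ["/", "/products/", "/about/"]

for path in PATHS:
    http_url = "http://www.example.com" + path
    resp = requests.get(http_url, allow_redirects=False, timeout=10)
    if resp.status_code in (301, 302, 308):
        print(http_url, "redirects to", resp.headers.get("Location"))
    else:
        # Crude substring test for a canonical pointing at an https URL.
        has_canonical = 'rel="canonical"' in resp.text and "https://" in resp.text
        print(http_url, "status", resp.status_code, "| canonical to https:", has_canonical)
```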
At that time we did not change the Google Analytics setup option to use "https" instead of "http", but GA appears to be recording data correctly. Yesterday we set up a new profile and selected "https", but our question is: does the Google Analytics http/https setting make a difference, and if so, what is it?
There has been some debate amongst my colleagues that Facebook obscures its advertising spend options. This is for setting up ads that go directly to a website.
I understand that Facebook offers both CPM & PPC: https://www.facebook.com/help/125430440869478. However, let's say that an advertiser hypothetically sets up a campaign and for some reason the ad receives absolutely no clicks (but some impressions). Will the advertiser still be charged?
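The way we understand the two billing models, the answer depends entirely on which one was chosen when the campaign was set up; a quick worked example with made-up numbers:

```python
# Worked example with made-up numbers: cost of 5,000 impressions and 0 clicks
# under CPM (pay per 1,000 impressions) vs. CPC (pay per click) billing.
impressions = 5_000
clicks = 0

cpm_rate = 4.00   # hypothetical $4 per 1,000 impressions
cpc_rate = 0.80   # hypothetical $0.80 per click

cpm_cost = impressions / 1_000 * cpm_rate   # charged even with zero clicks -> $20.00
cpc_cost = clicks * cpc_rate                # no clicks, no charge -> $0.00

print(f"CPM billing: ${cpm_cost:.2f}")
print(f"CPC billing: ${cpc_cost:.2f}")
```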
If I recall correctly, we used to be able to change our title tags dynamically based on the search query, but I'm not sure if it's possible now or if it makes sense to do so.
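For context, the approach I am remembering relied on reading the keyword out of the Google referrer's q= parameter, something like the sketch below (a hypothetical handler, not production code). Since Google moved to secure search that parameter is empty for nearly all organic visits, which I suspect is why the technique has largely disappeared.

```python
# Sketch of the legacy approach: pull the organic keyword from the Google
# referrer's q= parameter. Hypothetical handler, not production code.
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    query = parse_qs(parsed.query).get("q", [])
    return query[0] if query and query[0] else None

# Old-style referrer (rarely seen today) vs. a modern secure-search referrer.
print(keyword_from_referrer("http://www.google.com/search?q=blue+widgets"))  # "blue widgets"
print(keyword_from_referrer("https://www.google.com/"))                      # None
```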
Thoughts?
Rosemary
Recently I have been promoting custom long-form content development for major brand clients. For UX reasons we collapse the content so only 2-3 sentences of the first paragraph are visible. However, there is a "read more" link that expands the entire content piece.
I have believed that searchbots would have no problem crawling, indexing, and applying a positive SEO signal to this content. However, I'm starting to wonder: is there any evidence that the Google search algorithm could possibly discount or even ignore collapsed content?
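One thing that seems relevant either way is whether the collapsed text is present in the initial HTML (just hidden with CSS/JS) or only fetched when "read more" is clicked; in the latter case a crawler may never see it at all. A crude check we could run, with a placeholder URL and sample phrase:

```python
# Crude check: is the collapsed copy already in the page's raw HTML (fetchable by
# a crawler even if hidden), or is it missing until "read more" triggers a fetch?
# URL and sample phrase are hypothetical placeholders.
import requests  # assumes the requests library is installed

PAGE_URL = "https://www.example.com/landing-page"
SAMPLE_PHRASE = "a distinctive sentence from deep inside the collapsed section"

html = requests.get(PAGE_URL, timeout=10).text
if SAMPLE_PHRASE.lower() in html.lower():
    print("Collapsed copy is in the initial HTML, so crawlers can at least fetch it.")
else:
    print("Collapsed copy is NOT in the initial HTML; it is likely loaded on click.")
```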