Thanks for the thorough response. I was leaning toward leaving it alone for the time being, and this helps confirm my decision. I don't think we're going to see much benefit from tampering with it to make it more Googlebot-friendly.
Posts made by MJTrevens
-
RE: How to solve JavaScript paginated content for SEO
-
How to solve JavaScript paginated content for SEO
On our blog listings page, we limit the number of posts that can be seen on the page to 10. However, all of the posts are loaded in the HTML of the page, and pagination links are added to the bottom.
Example page: https://tulanehealthcare.com/about/newsroom/
When a user clicks to the next page, the script simply filters the content on the same page to show the next group of postings. Nothing in the HTML or URL changes; this is all done via JavaScript.
So the question is: does Google consider this hidden content, since all of the listings are in the HTML but only a handful of them are displayed on the page at a time?
Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination?
If this is indeed a problem, we have two possible solutions:
- Not building the HTML for the next pages until the user clicks to the next page.
- Adding parameters to the URL so each page of listings has its own address (see the sketch below).
Any other solutions that would be better for SEO?
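For reference, the second option would look roughly like this on our end. This is only a sketch: the `.blog-post` and `.pagination a` selectors, the `?page=` parameter, and the page size of 10 are assumptions about our markup rather than anything confirmed.

```ts
const PAGE_SIZE = 10;
const posts = Array.from(document.querySelectorAll<HTMLElement>(".blog-post"));

// Show only the posts belonging to the requested page number.
function showPage(page: number): void {
  posts.forEach((post, i) => {
    const onPage = Math.floor(i / PAGE_SIZE) + 1 === page;
    post.style.display = onPage ? "" : "none";
  });
}

// Pagination links are real anchors (e.g. <a href="?page=2">) so crawlers can
// discover them even if they never run this script.
document.querySelectorAll<HTMLAnchorElement>(".pagination a").forEach((link) => {
  link.addEventListener("click", (event) => {
    event.preventDefault();
    const page = Number(new URL(link.href).searchParams.get("page") ?? "1");
    history.pushState({ page }, "", link.href); // the URL now reflects the state
    showPage(page);
  });
});

// Honor ?page=N on initial load so the parameterized URLs show the right set.
showPage(Number(new URLSearchParams(location.search).get("page") ?? "1"));
```

The part I care about for SEO is that the pagination links stay plain anchors with real href values, so a crawler that never runs the script can still reach ?page=2, ?page=3, and so on.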
-
Can some sort of wildcard redirect be used on a single folder path?
We have a directory with thousands of pages and we are migrating the entire site to another root URL. These folder paths will not change on the new site, but we don't want to use a wildcard to redirect EVERYTHING to the same folder path on the new site.
Setting up manual 301 redirects on this particular directory would be crazy. Is there a way to isolate something like a wildcard redirect to apply only to a specific folder?
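To make the question concrete, this is roughly what I'm picturing, written as Express-style middleware only because I'm not yet sure what the old server runs; the /health-library path and the new domain are placeholders. The same idea should be expressible as a single rewrite rule in Apache or nginx.

```ts
import express from "express";

const app = express();

// Only requests under /health-library are caught; everything else on the old
// site is left for other rules (or no redirect at all).
app.use("/health-library", (req, res) => {
  // Inside a mounted handler, req.url is the remainder after the mount path,
  // query string included, so the rest of the path carries over unchanged.
  res.redirect(301, `https://www.new-site.example/health-library${req.url}`);
});

app.listen(8080);
```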
Thanks!
-
What's the best way to use redirects in a massive site consolidation?
We are migrating 13 websites into a single new domain. As part of that, certain pages will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and there are far too many of them to write custom 301s for.
One idea was to use domain forwarding or a wildcard redirect so that all of the pages would be redirected to the same folder path on the new domain. The problem this creates, though, is that we would still need to build the custom 301s for content that is moving to a new folder path, which creates two redirects on those pages (one for the domain forwarding, and then a second for the custom 301 pointing to the new folder).
Any ideas on a better solution to this?
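One variation I've been sketching is a single rule on each old domain that checks a custom mapping first and otherwise falls back to the same folder path on the new domain, so no URL ever goes through two hops. It's written here as Express-style middleware purely for illustration; the paths, the new domain, and the assumption that we control the old servers are all placeholders.

```ts
import express from "express";

const app = express();

// Pages that are moving to a new folder path get an explicit entry...
const customRedirects: Record<string, string> = {
  "/old-services/cardiology": "/services/heart-care",
  "/old-about/history": "/about/our-story",
};

app.use((req, res) => {
  // ...and everything else falls through to the same path on the new domain,
  // so every old URL is answered with exactly one 301.
  const target = customRedirects[req.path] ?? req.path;
  res.redirect(301, `https://www.new-domain.example${target}`);
});

app.listen(8080);
```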
-
Can Google Crawl & Index My Schema in CSR JavaScript?
We currently only have one option for implementing our schema: it is populated as JSON that is rendered by JavaScript on the client side.
I've heard tons of mixed reviews about whether this will work. Does anyone know for sure whether it will or will not?
Also, how can I build a test to see if it does or does not work?
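For context, what our client-side rendering effectively does boils down to something like this; the schema object itself is a made-up example, not our real markup.

```ts
// Build the structured data object (placeholder values) and inject it as a
// JSON-LD script tag once the page loads in the browser.
const schema = {
  "@context": "https://schema.org",
  "@type": "MedicalClinic",
  name: "Example Clinic",
  url: "https://www.example.com/locations/example-clinic",
};

const script = document.createElement("script");
script.type = "application/ld+json";
script.text = JSON.stringify(schema);
document.head.appendChild(script);
```

My working plan for a test is to run one of these URLs through the Rich Results Test and through the rendered HTML shown by the URL Inspection tool's live test in Search Console, and check whether the injected script tag shows up there, but I'd welcome a better approach.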
-
Can Google bypass an AJAX link?
On my company's events calendar page, clicking an event populates an overlay using AJAX, and the link inside that overlay then takes you to the actual event page.
I see this as a problem because Google can't follow the AJAX link to the true event page, so right now nothing on those pages is getting indexed and we can't use our schema to get events into Google's rich snippets or the Knowledge Graph.
Possible solutions I considered:
1. Remove the AJAX overlay and allow the link from the events calendar to go directly to the individual event.
2. Leave the AJAX overlay and try to get the individual event pages directly indexed in Google.
Thoughts and suggestions are greatly appreciated!
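A middle ground I've been sketching keeps the overlay for users but makes each calendar entry a plain anchor pointing at the real event URL; the selectors and the openOverlay function below are placeholders for whatever our site actually uses.

```ts
// Placeholder for the site-specific overlay logic (fetch the event details via
// AJAX and render them in the overlay).
function openOverlay(eventUrl: string): void {
  console.log("open overlay for", eventUrl);
}

document
  .querySelectorAll<HTMLAnchorElement>(".event-calendar a.event")
  .forEach((link) => {
    link.addEventListener("click", (event) => {
      event.preventDefault(); // users still get the overlay experience
      openOverlay(link.href);
    });
  });

// Because each event is a plain <a href="/events/some-event"> in the HTML, a
// crawler that never triggers the overlay can still follow the link to the
// full event page and the schema on it.
```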
-
Impact of Removing 60,000 Pages from Sites
We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places. So it's not unique at all and it brings in almost no organic traffic.
I want to remove this bloat from our sites. The problem is that this database accounts for almost 60,000 pages on each site, and it is all currently indexed. I'm a little worried that dumping all of this data at once will cause Google to wonder what in the world we are doing, and that we'll see some issues from it (at least in the short run).
My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google could still crawl it, understand that it is not relevant content, and the impact would be minimized when we do remove it completely.
Any other ideas for minimizing SEO issues?
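To make the staged idea concrete, I'm imagining something like the sketch below, written as Express-style middleware only for illustration; the /health-database prefix is a placeholder for wherever this content lives, and the assumption is that we can flip a switch per phase.

```ts
import express from "express";

const app = express();

// Flip this to "gone" once Search Console shows the section dropping out of
// the index.
const REMOVAL_PHASE: "noindex" | "gone" = "noindex";

app.use("/health-database", (req, res, next) => {
  if (REMOVAL_PHASE === "noindex") {
    // Phase 1: leave the pages up, but ask Google to drop them from the index.
    res.setHeader("X-Robots-Tag", "noindex");
    return next();
  }
  // Phase 2: answer 410 Gone so the removal reads as intentional rather than
  // as a site error.
  res.status(410).send("This content has been removed.");
});

app.listen(8080);
```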
-
RE: Site structure for location + services pages
Thank you Miriam!
This helps clarify some things. I was leaning toward a format like this so I'm glad to have a second opinion!
Now I just need to back it with some empirical evidence!
-
Site structure for location + services pages
We are in the process of restructuring our site and are trying to figure out Google's preference for location pages and services.
Let's say we are an auto repair company with lots of locations, and each one of them offers some unique services, while other services are offered by all or most of the other locations.
Should we have one global page for each service, with links to the location page of each shop that offers that service?
OR
Should we build a unique page about each service for every location, as a subfolder of each location (essentially creating a LOT of subpages, because each location has 15-20 services)?
Which will rank better?
-
Does Google have a separate crawler for JavaScript and content?
Someone told me this is true.
-
RE: How do I outrank Monster and ZipRecruiter in Google for Jobs?
If it were up to me, I would drop our own Google for Jobs postings and let the recruiter sites do it for us. Unfortunately, I'm not the final decision-maker, and despite my recommendations the company continues to push its own job postings.
In my opinion, unless we drastically change the job posting content, we aren't going to outrank the giants.
-
How do I outrank Monster and ZipRecruiter in Google for Jobs?
My company has recently started using Google for Jobs to post all of its available positions. Unfortunately, they also list these same jobs on third-party job boards like Monster.com, ZipRecruiter, Lensa, etc.
Some people are wondering why we aren't outranking these third-party sites. My explanation has been that those sites are built specifically for job searching and have much better authority with Google, so they are always going to outrank us.
Aside from taking down the third party posts, is there anything we can do to get our jobs higher in the rankings for Google for Jobs?
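For reference, the markup on our job pages boils down to something like the object below, which gets serialized into an application/ld+json script tag on each posting; every value here is a placeholder, and my working assumption (unconfirmed) is that completeness and accuracy of this markup is the main lever we control.

```ts
// Placeholder JobPosting structured data for one of our postings.
const jobPostingSchema = {
  "@context": "https://schema.org",
  "@type": "JobPosting",
  title: "Registered Nurse",
  description: "<p>Full job description goes here...</p>",
  datePosted: "2019-06-01",
  validThrough: "2019-07-31",
  employmentType: "FULL_TIME",
  hiringOrganization: {
    "@type": "Organization",
    name: "Example Health",
    sameAs: "https://www.example.com",
  },
  jobLocation: {
    "@type": "Place",
    address: {
      "@type": "PostalAddress",
      streetAddress: "123 Main St",
      addressLocality: "Nashville",
      addressRegion: "TN",
      postalCode: "37203",
      addressCountry: "US",
    },
  },
};

// Serialized into <script type="application/ld+json"> on the job page.
const json = JSON.stringify(jobPostingSchema);
```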
Any help is very much appreciated!
-
What to do when you have maxed out your GTM container
Unfortunately, we have maxed out the capacity of our GTM container because we are using the same container for many websites. We still need the ability to add new tags without deleting old ones.
As a temporary patch, we are creating a second container to push to all sites so we can run both containers simultaneously. Our biggest concern with this is slowing down page load speed by having the tags from both containers firing on every page.
What are the issues with having multiple containers on our sites?
What would be the best way to do this long term?
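For context, the temporary setup amounts to loading two containers against one shared dataLayer, roughly like the sketch below; the container IDs are placeholders and the loader is just a condensed version of the standard GTM snippet.

```ts
// Both containers share the same global dataLayer object.
const w = window as any;
w.dataLayer = w.dataLayer || [];

function loadGtmContainer(containerId: string): void {
  // Condensed version of the standard GTM loader: record the start event and
  // pull in gtm.js for this container ID.
  w.dataLayer.push({ "gtm.start": Date.now(), event: "gtm.js" });
  const script = document.createElement("script");
  script.async = true; // the loader itself doesn't block rendering
  script.src = `https://www.googletagmanager.com/gtm.js?id=${containerId}`;
  document.head.appendChild(script);
}

loadGtmContainer("GTM-XXXXXX"); // existing, full container
loadGtmContainer("GTM-YYYYYY"); // new overflow container
```

My rough understanding is that the second loader itself adds very little weight and that any slowdown will come from the tags the containers actually fire, but I'd like to confirm that.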
-
Anyone have a good process for Schema.org auditing?
I am looking to do a Schema.org audit across a large number of websites that all run on the same platform. I'm not really sure where to start or what format to use for the deliverable. I suppose I'd start by checking the current schema for errors and documenting them, then move on to additional schema that could be added to the JSON-LD?
For my last structured data audit I just used a spreadsheet, and it didn't come out as neat as I would have liked.
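This time I was thinking of at least scripting the extraction step, roughly like the sketch below; it assumes Node 18+ for the global fetch, the URL list is a placeholder, and it only sees JSON-LD present in the raw HTML (anything rendered client-side would need a rendered copy of the page instead).

```ts
// Pages to audit (placeholders).
const pages = [
  "https://www.example.com/",
  "https://www.example.com/services/",
];

// Pull every <script type="application/ld+json"> block out of the raw HTML.
const JSON_LD_RE =
  /<script[^>]*type=["']application\/ld\+json["'][^>]*>([\s\S]*?)<\/script>/gi;

async function auditPage(url: string) {
  const html = await (await fetch(url)).text();
  const types: string[] = [];
  let parseErrors = 0;
  for (const match of html.matchAll(JSON_LD_RE)) {
    try {
      const data = JSON.parse(match[1]);
      const nodes = Array.isArray(data) ? data : [data];
      for (const node of nodes) types.push(String(node["@type"] ?? "unknown"));
    } catch {
      parseErrors += 1; // malformed JSON-LD is itself worth flagging
    }
  }
  return { url, types: types.join(", "), parseErrors };
}

async function main() {
  const rows = await Promise.all(pages.map(auditPage));
  console.table(rows); // one row per page: URL, schema types found, parse errors
}

main();
```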
Anyone who has some experience in this, your input would be much appreciated!
-
RE: Any way to force a URL out of Google index?
Thank you! The redirect was my suspicion as well. It's the last issue that could be causing this; thank you for confirming.
-
Any way to force a URL out of Google index?
As far as I know, there is no way to truly FORCE a URL to be removed from Google's index. We have a page that is being stubborn. Even after it was 301 redirected to an internal secure page months ago and a noindex tag was placed on it in the backend, it still remains in the Google index.
I also submitted a request through the remove outdated content tool https://www.google.com/webmasters/tools/removals and it said the content has been removed. My understanding though is that this only updates the cache to be consistent with the current index. So if it's still in the index, this will not remove it.
Just asking for confirmation - is there truly any way to force a URL out of the index? Or to even suggest more strongly that it be removed?
It's the first listing in this search https://www.google.com/search?q=hcahranswers&rlz=1C1GGRV_enUS753US755&oq=hcahr&aqs=chrome.0.69i59j69i57j69i60j0l3.1700j0j8&sourceid=chrome&ie=UTF-8
-
I have never seen this in Google SERPs before?
What is this that I am seeing in the Google SERPs? It's a bank of 3 separate articles from one site, but they are grouped together like an organic listing.
Check out the link to see what I'm talking about: http://imgur.com/xbUk1NG
-
Should I use the Change of Address in Search Console when moving subdomains to subfolders?
We have several subdomains for various markets for our business. We are in the process of moving those subdomains to subfolders on the main site.
Example: boston.example.com will become example.com/boston
And seattle.example.com will become example.com/seattle and so on.
It's not truly a change of address, but should I use the Change of Address tool in GSC for each of these subdomains as they move?
-
Will 301 Redirects Slow Page Speed?
We have a lot of subdomains that we are switching to subfolders, and we need to 301 redirect all the pages from those subdomains to the new URLs. There are over 1,000 redirects that need to be implemented.
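For context, what we're planning is a single host-based rule rather than 1,000 individual redirects, roughly like the sketch below; it's written as Express-style middleware only for illustration, and the domain names are placeholders for our actual setup.

```ts
import express from "express";

const app = express();

// e.g. boston.example.com/some/page -> https://www.example.com/boston/some/page
app.use((req, res) => {
  const subdomain = req.hostname.split(".")[0];
  res.redirect(301, `https://www.example.com/${subdomain}${req.originalUrl}`);
});

app.listen(8080);
```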
So, will the 301 redirects slow page speed for every user regardless of which URL they come through? Or, as the old URLs are dropped from Google's index and the new URLs take over in the SERPs, will those redirects eventually have no effect on page speed?
I've been trying to find a clear answer to this and have yet to find a good one.
-
RE: Can I use duplicate content in different US cities without hurting SEO?
Thanks for the response, Roman. I totally agree with you on this, but if you canonicalize all but one of the pages, then those pages will be dropped from the Google index, right? Google will almost always display the original version regardless of location, so those pages will not reap any organic traffic. That pretty much puts us back at the starting point.
I guess to simplify the question: if we use duplicate content on several sites in various locations, is there any way we can get all of those pages to rank in their respective markets?
-
Facebook Instant Articles - will it hurt SEO?
I am intrigued by Facebook IA and am thinking that putting some of my company articles on it would be beneficial for marketing. Question is, will it hurt SEO if I take an article that is on my website and also submit it to Facebook IA? Will it register as duplicate content in Google?
Could it have other benefits/risks?
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan.
My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line. If/when a facility offers that service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service.
They claim, "Google is smart, it knows it's content all from the same company, and because it's in different local markets, it will still rank."
My contention is that duplicate content is duplicate content, and unless we localize it, Google is going to prioritize one page and the rest will get very little exposure in the rankings no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction.
SEO Experts, your help is genuinely appreciated!
-
Is there an easy way to switch hundreds of websites to https in GSC?
My company has hundreds of websites set up in Google Search Console, but we will soon be moving them all to HTTPS. Is there an easy way to make the switch in GSC, or do we have to add the new HTTPS properties one by one?
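If it helps frame the question: I'm not positive this is the right mechanism, but my understanding is that the Search Console (Webmasters v3) API has a Sites.add call, so a small script could loop through the HTTPS versions rather than clicking through the UI. Everything below is a sketch under that assumption; the token and site list are placeholders.

```ts
// Placeholder OAuth token with the Search Console (webmasters) scope and a
// placeholder list of the new HTTPS properties to add.
const accessToken = "ya29.placeholder-oauth-token";
const sites = [
  "https://www.example-facility-1.com/",
  "https://www.example-facility-2.com/",
];

async function addSite(siteUrl: string): Promise<void> {
  // My reading of the docs: Sites.add is a PUT to the site resource.
  const endpoint =
    "https://www.googleapis.com/webmasters/v3/sites/" +
    encodeURIComponent(siteUrl);
  const response = await fetch(endpoint, {
    method: "PUT",
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!response.ok) {
    console.error(`Failed to add ${siteUrl}: ${response.status}`);
  }
}

async function main() {
  for (const site of sites) await addSite(site);
}

main();
```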
-
Soft 404s on a 301 Redirect... Why?
So we launched a site about a month ago. Our old site had an extensive library of health content that went away with the relaunch. We redirected this entire section of the site to the new education materials, but we've yet to see this reflected in the index or in GWT. In fact, we're getting close to 500 soft 404s in GWT. Our development team confirmed for me that the 301 redirect is configured correctly.
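For reference, this is the kind of spot-check I've been running to see what the old library URLs actually return; it assumes Node 18+ for the global fetch, and the URLs are placeholders.

```ts
// Old health-library URLs to check (placeholders).
const oldUrls = [
  "https://www.example.com/health-library/article-1",
  "https://www.example.com/health-library/article-2",
];

async function checkRedirect(url: string): Promise<void> {
  // redirect: "manual" keeps fetch from following the redirect, so we can see
  // the raw status code and Location header that a crawler would see.
  const res = await fetch(url, { redirect: "manual" });
  console.log(url, res.status, res.headers.get("location") ?? "(no location)");
}

async function main() {
  for (const url of oldUrls) await checkRedirect(url);
}

main();
```

If any of these came back as something other than a clean 301 to a substantial page, that would presumably explain the soft 404s, but so far I haven't spotted anything like that.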
Is it just a waiting game at this point or is there something I might be missing?
Any help is appreciated. Thanks!
-
RE: Should I NoFollow Links Between Our Company Websites?
They're actually just linked with the name of the business, which is generally a mix of brand, location, and specialty. No manipulative keyword anchor text at all.
-
Should I NoFollow Links Between Our Company Websites?
The company I work for owns and operates hundreds of websites throughout the United States. Each of these is tied to a legitimate local business, often with specific regional branding and mostly unique content. All of our domains are in pretty good shape and have never participated in any shady link building or SEO.
These sites currently link to the other sites within their market quite often. It makes perfect sense from a user standpoint, since a visitor interested in one business's offering would likely be interested in the others. My question is whether or not we should nofollow the links to our other sites. Nothing has happened from Google in terms of penalties, and the links don't seem to be hurting our sites now even though they are all followed, but I also don't want to be on the false-positive side of any future algorithm update targeting link quality.
What do you think? Keep them followed or introduce nofollow?