Latest posts made by MJTrevens
-
RE: How to solve JavaScript paginated content for SEO
Thanks for the thorough response. I was leaning toward leaving it alone for the time being, and this helps affirm my decision. I don't think we'll see much benefit from tampering with it to make it more Googlebot-friendly.
-
How to solve JavaScript paginated content for SEO
On our blog listings page we limit the number of blog posts that can be seen on the page to 10. However, all of the posts are loaded in the HTML of the page, and page links are added at the bottom.
Example page: https://tulanehealthcare.com/about/newsroom/
When a user clicks the next page, it simply filters the content on the same page to show the next group of postings. Nothing in the HTML or URL changes; this is all done via JavaScript.
So the question is, does Google consider this hidden content, since all listings are in the HTML but only a handful of them are visible on the page?
Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination?
If this is indeed a problem, we have two possible solutions:
- not building the HTML for the next pages until the user clicks 'next'.
- adding parameters to the URL to show that the content has changed (a sketch follows below).
Any other solutions that would be better for SEO?
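For what it's worth, here is a minimal client-side sketch of the second option, assuming a hypothetical posts array and hypothetical "blog-list" / "pagination" element IDs. Each page of listings gets its own linkable ?page= URL, only that page's slice is rendered, and the pagination links are plain anchor elements a crawler can request. Ideally the server would also render the correct slice for each ?page= URL so the visible items are present in the initial HTML.

```typescript
// Minimal sketch of option two: every page of listings has its own ?page= URL,
// only that page's slice of posts is rendered, and the pagination links are
// plain <a href> elements. The `posts` array and the "blog-list" /
// "pagination" element IDs are hypothetical placeholders.

interface PostSummary {
  title: string;
  url: string;
}

const PAGE_SIZE = 10;

function renderListingPage(posts: PostSummary[]): void {
  const params = new URLSearchParams(window.location.search);
  const page = Math.max(1, parseInt(params.get("page") ?? "1", 10) || 1);

  // Render only the current page's slice of listings.
  const start = (page - 1) * PAGE_SIZE;
  const visible = posts.slice(start, start + PAGE_SIZE);
  const list = document.getElementById("blog-list");
  if (list) {
    list.innerHTML = visible
      .map((p) => `<li><a href="${p.url}">${p.title}</a></li>`)
      .join("");
  }

  // Each pagination link is a distinct, crawlable URL rather than a JS-only control.
  const totalPages = Math.ceil(posts.length / PAGE_SIZE);
  const nav = document.getElementById("pagination");
  if (nav) {
    nav.innerHTML = Array.from(
      { length: totalPages },
      (_, i) => `<a href="?page=${i + 1}">${i + 1}</a>`
    ).join(" ");
  }
}
```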
-
Can some sort of wildcard redirect be used on a single folder path?
We have a directory with thousands of pages and we are migrating the entire site to another root URL. These folder paths will not change on the new site, but we don't want to use a wildcard to redirect EVERYTHING to the same folder path on the new site.
Setting up manual 301 redirects on this particular directory would be crazy. Is there a way to isolate something like a wildcard redirect to apply only to a specific folder?
Thanks!
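Not an authoritative answer, but as a sketch of the idea: anchor the match on the folder prefix so only that directory is redirected, with the rest of the path preserved. In Apache this is typically a rewrite or RedirectMatch rule scoped to that path; below the same logic is expressed as Node/Express middleware, with the folder name and destination domain as hypothetical placeholders.

```typescript
// Sketch of a wildcard 301 scoped to a single folder, written as Express
// middleware. OLD_PREFIX and NEW_ORIGIN are hypothetical placeholders.
import express from "express";

const app = express();
const OLD_PREFIX = "/resources/";               // the migrating directory
const NEW_ORIGIN = "https://www.new-site.example";

app.use((req, res, next) => {
  // Only URLs under the target folder are redirected; the path (and query
  // string) is preserved, so /resources/foo/bar keeps the same path on the
  // new site. Everything outside the folder is left alone.
  if (req.path.startsWith(OLD_PREFIX)) {
    res.redirect(301, NEW_ORIGIN + req.originalUrl);
    return;
  }
  next();
});

app.listen(8080);
```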
-
What's the best way to use redirects on a massive site consolidation
We are migrating 13 websites into a single new domain. Certain pages will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and it's far too many to write custom 301s for.
One idea was to use domain forwarding or a wildcard redirect so that all the pages would be redirected to their same folder path on the new URL. The problem this creates, though, is that we would then need to build the custom 301s for content that is moving to a new folder path, creating a chain of two redirects on those pages (one for the domain forwarding, then a second for the custom 301 pointing to the new folder). A sketch of one way to avoid the chain is below.
Any ideas on a better solution to this?
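One way to avoid the two-hop chain is to evaluate the custom 301 map before the catch-all, so every old URL gets exactly one redirect: either its custom destination or its unchanged path on the new domain. A hedged sketch as Node/Express middleware, with hypothetical domain and paths; most rewrite engines support the same "most specific rule first" ordering.

```typescript
// Sketch of one-hop redirects for a consolidation: explicit 301 mappings are
// checked first, and only URLs without a custom entry fall through to the
// path-preserving catch-all, so no page is redirected twice. The domain and
// paths are hypothetical placeholders.
import express from "express";

const app = express();
const NEW_ORIGIN = "https://www.new-site.example";

// Explicit redirects for pages whose folder path changes on the new domain.
const customRedirects: Record<string, string> = {
  "/old-services/cardiology": "/services/heart-care",
  "/old-about/leadership": "/about/our-team",
};

app.use((req, res) => {
  // One lookup, one 301: custom destination if it exists, otherwise same path.
  const target = customRedirects[req.path] ?? req.originalUrl;
  res.redirect(301, NEW_ORIGIN + target);
});

app.listen(8080);
```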
-
Can Google Crawl & Index my Schema in CSR JavaScript
We currently have only one option for implementing our schema: it is populated in JSON that is rendered by JavaScript on the client side.
I've heard tons of mixed reviews about whether this will work or not, so does anyone know for sure whether it will?
Also, how can I build a test to see if it does or does not work?
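For reference, this is roughly the setup being described: the schema is injected on the client as a JSON-LD script tag (the event data below is made up for illustration). A practical test is to run the live URL through Google's Rich Results Test, which executes JavaScript, and to check the rendered HTML shown by the URL Inspection tool in Search Console for the injected tag.

```typescript
// Rough sketch of the setup described above: schema injected on the client as
// a JSON-LD <script> tag. The event data here is made up for illustration.
function injectJsonLd(): void {
  const data = {
    "@context": "https://schema.org",
    "@type": "Event",
    name: "Community Health Fair",
    startDate: "2024-06-01T10:00",
    location: { "@type": "Place", name: "Main Campus" },
  };

  const script = document.createElement("script");
  script.type = "application/ld+json";
  script.textContent = JSON.stringify(data);
  document.head.appendChild(script);
}

injectJsonLd();
```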
-
Can Google bypass an AJAX link?
On my company's events calendar page, when you click an event it populates an overlay using AJAX, and the link inside that overlay then takes you to the actual event page.
I see this as a problem because Google can't follow the AJAX link to the true event page, so right now nothing on those pages is getting indexed, and we can't utilize our schema to get events to populate in Google rich snippets or the knowledge graph.
Possible solutions I considered:
1. Remove the AJAX overlay and allow the link from the events calendar to go directly to the individual event.
2. Leave the AJAX overlay and try to get the individual event pages directly indexed in Google.
Thoughts and suggestions are greatly appreciated!
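A third, hedged option worth considering: keep the overlay for users, but make the calendar link a real anchor pointing at the event page and enhance it with JavaScript, so a crawler that never fires the click handler can still follow the link to the event page and its schema. A rough sketch with hypothetical selectors:

```typescript
// Hedged sketch of a progressive-enhancement variant of option 1: the calendar
// link is a real <a href> to the event page, and the AJAX overlay is layered on
// top for users. The "a.event-link" selector and "event-overlay" element are
// hypothetical placeholders.
function enhanceEventLinks(): void {
  document.querySelectorAll<HTMLAnchorElement>("a.event-link").forEach((link) => {
    link.addEventListener("click", (event) => {
      event.preventDefault();       // users get the overlay...
      void openOverlay(link.href);  // ...loaded via AJAX from the same URL
    });
  });
}

async function openOverlay(url: string): Promise<void> {
  const response = await fetch(url);
  const html = await response.text();
  const overlay = document.getElementById("event-overlay");
  if (overlay) {
    overlay.innerHTML = html;
    overlay.hidden = false;
  }
}

enhanceEventLinks();
```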
-
Impact of Removing 60,000 Pages from Sites
We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places. So it's not unique at all and it brings in almost no organic traffic.
I want to remove this bloat from our sites. The problem is that this database accounts for almost 60,000 pages on each site, all of which are currently indexed. I'm a little worried that flat-out dumping all of this data at once is going to cause Google to wonder what in the world we are doing, and that we'll see some issues from it (at least in the short run).
My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google could still crawl it, understand that it is not relevant user content, and the impact would be minimized when we do terminate it completely. A rough sketch of how that staged noindex could be applied is below.
Any other ideas for minimizing SEO issues?
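As a sketch of the "deindex first" idea: a noindex signal can be applied to the whole database section server-side with an X-Robots-Tag response header, so Google can keep crawling the URLs while they drop out of the index, and the pages can then be removed in stages. The folder name below is a hypothetical placeholder (Node/Express).

```typescript
// Sketch of the "deindex first" idea: serve an X-Robots-Tag: noindex header
// for the whole database section before the pages are removed, so the URLs
// can still be crawled while they drop out of the index. DATABASE_PREFIX is
// a hypothetical placeholder.
import express from "express";

const app = express();
const DATABASE_PREFIX = "/health-library/";

app.use((req, res, next) => {
  if (req.path.startsWith(DATABASE_PREFIX)) {
    // Equivalent to a meta robots noindex tag, but applied server-side to
    // the entire section at once.
    res.setHeader("X-Robots-Tag", "noindex");
  }
  next();
});

app.listen(8080);
```

Once the pages are actually removed, returning 404 or 410 for those URLs finishes the clean-up.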
-
RE: Site structure for location + services pages
Thank you Miriam!
This helps clarify some things. I was leaning toward a format like this so I'm glad to have a second opinion!
Now I just need to back it with some empirical evidence!
-
Site structure for location + services pages
We are in the process of restructuring our site and are trying to figure out Google's preference for location pages and services.
Let's say we are an auto repair company with lots of locations, and each one of them offers some unique services, while other services are offered by all or most other locations.
Should we have a global page for each service live, with a link to the location page for each shop that offers that service?
OR
Should we build a unique page about each service for every location, as a subfolder of each location (essentially creating a LOT of sub-pages, because each location has 15-20 services)?
Which will rank better?
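Just to make the trade-off concrete, here is a tiny sketch with made-up slugs showing how the two structures scale: option A yields one page per service, while option B yields locations times services pages.

```typescript
// Tiny sketch with made-up slugs, just to make the scale of the two structures
// concrete. Option A: one global page per service that links to the locations
// offering it. Option B: a service page nested under every location.
const locations = ["dallas", "austin", "houston"];
const services = ["brake-repair", "oil-change", "transmission"];

// Option A: /services/brake-repair, ... (one page per service)
const optionA = services.map((s) => `/services/${s}`);

// Option B: /locations/dallas/brake-repair, ... (locations x services pages)
const optionB = locations.flatMap((l) => services.map((s) => `/locations/${l}/${s}`));

console.log(optionA.length, "global service pages vs", optionB.length, "location-service pages");
```

With 15-20 services per location, option B multiplies quickly, which is where the "LOT of sub-pages" concern comes from.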
-
Does Google have a separate crawler for JavaScript and Content?
Someone told me this is true.
Best posts made by MJTrevens
-
Can I use duplicate content in different US cities without hurting SEO?
So, I have major concerns with this plan.
My company has hundreds of facilities located all over the country, and each facility has its own website. We have a third-party company working to build a content strategy for us. What they came up with is to create a bank of content specific to each service line; if/when any facility offers that service, they upload the content for that service line to that facility's website. So in theory, you might have 10-12 websites, all in different cities, with the same content for a service.
They claim "Google is smart, it knows its content all from the same company, and because it's in different local markets, it will still rank."
My contention is that duplicate content is duplicate content, and unless it is localized, Google is going to prioritize one page of it and the rest will get very little exposure in the rankings no matter where you are. I could be wrong, but I want to be sure we aren't shooting ourselves in the foot with this strategy, because it is a major undertaking and too important to go off in the wrong direction.
SEO Experts, your help is genuinely appreciated!
-
Facebook Instant Articles - will it hurt SEO?
I am intrigued by Facebook IA and am thinking that putting some of my company articles on it would be beneficial for marketing. Question is, will it hurt SEO if I take an article that is on my website and also submit it to Facebook IA? Will it register as duplicate content in Google?
Could it have other benefits/risks?