Without prerender.io, is Google able to render & index geographical dynamic content?
-
One section of our website is built as a single-page application and serves dynamic content based on geographical location.
Before I got here, we used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render & index geographical dynamic content? I'm assuming not. If the answer is no, what are some solutions other than converting everything to static HTML (which would be a huge overhaul)?
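One common workaround, in the spirit of what prerender.io did, is dynamic rendering: detect known crawler user-agents and serve them a prerendered HTML snapshot, while regular visitors get the client-side app. The sketch below is illustrative only; the crawler list is incomplete, and `renderSnapshot` is a hypothetical function you would back with a headless browser or your own server-side rendering.

```javascript
// Dynamic rendering sketch: known crawlers get a prerendered HTML
// snapshot; everyone else falls through to the normal client-side SPA.
// The pattern list is illustrative and incomplete.
const CRAWLER_PATTERNS = [
  /googlebot/i,
  /bingbot/i,
  /duckduckbot/i,
  /baiduspider/i,
  /yandex/i,
  /twitterbot/i,
  /facebookexternalhit/i,
];

function isCrawler(userAgent) {
  if (!userAgent) return false;
  return CRAWLER_PATTERNS.some((re) => re.test(userAgent));
}

// Express-style middleware; renderSnapshot is a hypothetical function
// you would implement with a headless browser or server-side rendering.
function dynamicRendering(renderSnapshot) {
  return async (req, res, next) => {
    if (isCrawler(req.headers['user-agent'])) {
      res.send(await renderSnapshot(req.url));
      return;
    }
    next(); // regular visitors fall through to the SPA bundle
  };
}
```

Because the page varies by geography, the snapshot should be rendered from a single default location; Googlebot crawls mostly from US IPs, so it will generally only ever see one geographic variant anyway.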
Related Questions
-
Would this be duplicate content or bad SEO?
Hi Guys, We have a blog for our e-commerce store, with a full-time in-house writer producing content. As part of our process we write content briefs, and as part of each brief we analyze competing pieces of content on the web. Most of the time the sources are large publications (e.g. HGTV, Elle Decor, Apartment Therapy, House Beautiful, NY Times, etc.). The analysis is basically a summary/breakdown of the article, sometimes 2-3 paragraphs long for longer pieces of content. The competing-content analysis is used to create an outline of our article and incorporates the most important details/facts from the competing pieces, but not all. Most of our articles run 1,500-3,000 words. Here are the questions: Would it be considered duplicate content, or bad SEO practice, if we listed the sources/links we used at the bottom of our blog post, along with the summaries from our content brief? Could this be beneficial for SEO? If we do this, should we nofollow the links, or use regular dofollow links? For example: "For your convenience, here are some articles we found helpful, along with brief summaries." I want to use as much of the content we have spent time on as possible. TIA
White Hat / Black Hat SEO | | kekepeche1 -
Somebody took an article from my site and posted it on their own site, but credited it back to my site. Is this duplicate content?
Hey guys, This question may sound a bit odd, but someone copied our article and re-posted the exact article on their site. However, the article was credited to our site, and the original author of the article had approved the other site doing this. We created the article first, though. Will this still be regarded as duplicate content? The owner of the other site has told us it isn't, because they credited it. Any advice would be awesome. Thanks
White Hat / Black Hat SEO | | edward-may0 -
80% of traffic lost overnight, Google Penalty?
Hi all.
White Hat / Black Hat SEO | | Hemjakt
I have a website called Hemjakt (http://www.hemjakt.se/), a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all listings in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. 3 days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (address, house type, rooms, area, city), I'm now only found on the fifth page. I suspect I have become the subject of a Google penalty. How do I get out of this mess? Just like all search engines and aggregators, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I add user value by giving visitors the ability to compare houses, a ton more data to compare pricing and history, extra functionality the source sites do not offer, and so on. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My Site: http://www.hemjakt.se/bostad/55860-asgatan-15/ So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. If I am penalized: I'm not attempting to do anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application would you suggest so that I can continue running the service in a way Google is fine with?
0 -
Is Google no longer penalizing aggressively for on-page manipulation?
I wanted to throw this out there: with so much emphasis on Google cracking down on bad linking, have they let up enforcement against manipulative on-page tactics in recent years? I've been seeing hidden text popping up again and ranking. Here is an example: Google "landscaping Portsmouth NH" and find the #1 result, then search for "Portsmouth" on the page. What I find interesting is that the site has a clean backlink profile, but hiding those keywords is pretty blatant manipulation. Also interesting: I filled out a spam report on it a year ago (I'm not a big "fill out spam reports" guy; I was curious whether Google would take action). A year later it is still #1 for that competitive keyword. So I'm curious if others have seen similar tactics, like font-size:0px or text colored to match the background, popping back up and ranking. I would love others' thoughts on it.
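For illustration, the blatant inline-style patterns described above (zero font size, text colored to match the background) can be flagged with a rough check like the one below. This is a sketch only: a real audit would inspect computed styles in a rendered page, and the helper name is made up.

```javascript
// Rough check for the hidden-text patterns discussed above.
// Only catches blatant inline cases; a real audit would use
// computed styles from a rendered page.
function looksLikeHiddenText(inlineStyle, backgroundColor) {
  const style = (inlineStyle || '').toLowerCase().replace(/\s+/g, '');

  // Zero font size, e.g. font-size:0px or font-size:0
  if (/font-size:0(px|pt|em|%)?(;|$)/.test(style)) return true;

  // Outright hidden elements
  if (/display:none/.test(style) || /visibility:hidden/.test(style)) return true;

  // Text color identical to the background color
  const colorMatch = style.match(/(?:^|;)color:([^;]+)/);
  if (colorMatch && backgroundColor &&
      colorMatch[1] === backgroundColor.toLowerCase().replace(/\s+/g, '')) {
    return true;
  }
  return false;
}
```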
White Hat / Black Hat SEO | | BCutrer0 -
Partial Match Penalty Site - Move Portion & Redirect To New Site
So I have a site that currently has a partial-match penalty from Google, and I have been working to get it removed. Bad SEO: basically my site was submitted to a bunch of bad blog networks. Hopefully the penalty gets lifted soon as we remove and disavow links. That said, I was planning on moving a portion of my site to a new site, since it's not really the focus of the site anymore but still pays the bills. I have also been building it into more of a network of sites. If I do that and 301 redirect the pages I moved, will the penalty carry over? On the current site I planned on using rel="nofollow" on any links I may change in the header/menus etc. Some of these pages I believe have the penalty while others don't. I really don't want to screw anything up more than it already is. My biggest fear is that it's perceived as a black-hat method or something like that. Any thoughts?
White Hat / Black Hat SEO | | dueces0 -
How do I know which links are bad enough for the Google disavow tool?
I am currently working for a client whose backlink profile is questionable. The issue I am having is: does Google feel the same way about it as I do? We have no current warnings, but we had one in the past for "unnatural inbound links". We removed the links we felt it referred to and have not received any further warnings, nor have we noticed any significant drop in traffic or rankings at any point. My concern is that if I work toward getting the more ominous-looking links removed (directories, reciprocal links from irrelevant sites, etc.), either manually or with the disavow tool, how can I be sure I am not removing links that are in fact helping our campaign? Are we likely to suffer from the next Penguin update if we choose to proceed without removing the aforementioned links? Or is Google only likely to target the seriously black-hat links (link farms etc.)? Any thoughts or experiences would be greatly appreciated.
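For reference, a disavow file is a plain-text list uploaded through Google's disavow tool: one URL or `domain:` directive per line, with `#` for comments. The domains below are placeholders, not real examples:

```text
# Example disavow file (placeholder domains only).
# "domain:" disavows every link from that domain; a bare URL disavows one page.
domain:spammy-directory.example
domain:link-farm.example
http://reciprocal-partner.example/links/page1.html
```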
White Hat / Black Hat SEO | | BallyhooLtd0 -
Backlinks According to Google
Good Morning, Google has just recognized some links going to my site. I used an SEO toolbar for Firefox that showed me the links according to Google. My question is that those links have been there for ages and Google has only just recognized them. Is there a reason for this? Does Google only show links quarterly or half-yearly? Thanks SEO_123
White Hat / Black Hat SEO | | TWPLC_seo0 -
How does Google treat an RSS fetcher?
I want to know how Google treats an RSS fetcher. I want to pull my blog posts onto my own website. Both are on the same domain, but I want the posts to be updated automatically on the home page of my website through an RSS fetcher created from my blog page. My site is http://www.myrealdata.com and my blog is http://www.myrealdata.com/blog
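A minimal sketch of the fetch-and-render step: pull the latest items out of the blog's RSS feed so they can be listed on the home page. This assumes a simple, well-formed feed; a production version should use a real XML parser rather than regular expressions.

```javascript
// Extract the most recent items (title + link) from an RSS feed string.
// Regex-based for brevity; only suitable for simple, well-formed feeds.
function parseRssItems(xml, limit = 5) {
  const items = [];
  const itemRe = /<item>([\s\S]*?)<\/item>/g;
  let match;
  while ((match = itemRe.exec(xml)) !== null && items.length < limit) {
    const body = match[1];
    const title = (body.match(/<title>([\s\S]*?)<\/title>/) || [])[1] || '';
    const link = (body.match(/<link>([\s\S]*?)<\/link>/) || [])[1] || '';
    items.push({ title: title.trim(), link: link.trim() });
  }
  return items;
}
```

Because both the blog and the home page sit on the same domain, showing a short list of titles and links this way is just ordinary internal linking; duplication only becomes a concern if the full post bodies are reproduced on both URLs.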
White Hat / Black Hat SEO | | SangeetaC0