My URL is: https://plentific.com/
Hi guys,
About us:
We are running an AngularJS SPA for property search.
As you can imagine, being an SPA rendered entirely in JavaScript has proven to be an SEO nightmare.
We are currently implementing Google's AJAX crawling scheme: crawlers get an "_escaped_fragment_" version of each page, pre-rendered with PhantomJS.
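For context, the serving side looks roughly like the minimal Express sketch below (illustrative only: the snapshots/ directory and its naming scheme are made up, and our real stack differs in the details):

// server.js -- minimal Express sketch (illustrative). Requests carrying
// ?_escaped_fragment_= get a pre-rendered snapshot; everyone else gets
// the normal JavaScript app shell.
var express = require('express');
var fs = require('fs');
var path = require('path');

var app = express();

app.use(function (req, res, next) {
  var fragment = req.query._escaped_fragment_;
  if (fragment === undefined) return next();   // regular visitor: serve the SPA
  // The snapshots/ directory and file-naming scheme are assumptions
  // made for illustration.
  var file = path.join(__dirname, 'snapshots',
      (fragment || 'index').replace(/[^\w-]/g, '_') + '.html');
  fs.readFile(file, 'utf8', function (err, html) {
    if (err) return res.status(503).send('Snapshot unavailable'); // better than an empty 200
    res.send(html);
  });
});

app.use(express.static(path.join(__dirname, 'public')));
app.listen(3000);

The point of the 503 branch is that a missing or failed snapshot should return an error status rather than a blank 200, so the crawler never sees an empty page as valid content.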
Unfortunately, pre-rendering the pages takes some time, and worse, it occasionally fails, so the snapshot served to the crawler is an empty page.
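The capture side is roughly the PhantomJS script below (again a simplified sketch; window.prerenderReady is a flag our Angular code would set once its API data has rendered, it is not something AngularJS or PhantomJS provides):

// prerender.js -- minimal sketch: render a URL in PhantomJS and print
// the HTML once the app signals it is ready, instead of capturing
// immediately (capturing too early is what produces empty snapshots).
var page = require('webpage').create();
var system = require('system');
var url = system.args[1];

page.open(url, function (status) {
  if (status !== 'success') {
    console.log('FAILED to load ' + url);
    phantom.exit(1);
  } else {
    var attempts = 0;
    var poll = setInterval(function () {
      var ready = page.evaluate(function () {
        // window.prerenderReady is an assumed flag our app would set
        // after its API data has rendered -- not part of AngularJS.
        return window.prerenderReady === true;
      });
      if (ready || ++attempts >= 50) {    // give up after ~25 seconds
        clearInterval(poll);
        if (ready) {
          console.log(page.content);      // the fully rendered HTML
          phantom.exit(0);
        } else {
          phantom.exit(2);                // signal failure: never cache or serve this
        }
      }
    }, 500);
  }
});

The non-zero exit codes let the serving layer refuse to cache or serve a failed render, which is exactly the failure mode we keep hitting.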
The problem:
When I manually submit pages to Google using the Fetch as Google tool, they get indexed and actually rank quite well for a few days; after that they are simply dropped from the index.
Not pushed lower in the rankings, but removed completely.
Even the Google cache returns a 404.
The questions:
1.) Could this be caused by serving an "_escaped_fragment_" version to the bots? (Bear in mind it is identical to the one users see.)
or
2.) Could it be that, because we pull our results from an API, the pages are being treated as "duplicate content"? And shouldn't duplicate content merely lower our SERP positions rather than remove the pages entirely?
and
3.) Could this be a technical problem with how we serve the content, or does Google simply not trust sites served this way?
Thank you very much!
Pavel Velinov
SEO at Plentific.com