Does it make sense to use an SPA if its content is not indexed?

Does it make sense to use client-side frameworks that are fully responsible for rendering the published content (Angular, React) if that content is not indexed by search engines and is generally not SEO-friendly? A website seems useless in that case if nobody can find it. For example, the site builder wix.com generates its content with React, and if you look at the page source of a site built with it, you will see no information there: all of the content is loaded dynamically.

I came across the term "isomorphic JS". Does it solve this problem, so that search engines will see the site's content? I would like to try something new, but if the point of the site is to make its information available to users through search engines, isn't it better to render everything on the server the "old way"? What are the advantages of an SPA in this case?
July 9th 19 at 13:24
5 answers
July 9th 19 at 13:26
An isomorphic framework is exactly what you need for indexing. If the pages are rendered on the server side, search engines will see everything they need. Besides, modern search engines cope reasonably well with SPA sites: Yandex, for instance, documents how to set this up, while Google prefers progressive enhancement.
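For illustration, a minimal server-side rendering sketch in the React flavour mentioned above (Express and react-dom/server are assumed to be installed; App and /bundle.js are hypothetical names):

// server.js - minimal SSR sketch: the crawler receives finished HTML, not an empty div
const express = require('express');
const React = require('react');
const { renderToString } = require('react-dom/server');
const App = require('./App'); // hypothetical root component of the SPA

const app = express();

app.get('*', (req, res) => {
  // Render the same component tree on the server that the browser would render.
  const html = renderToString(React.createElement(App, { url: req.url }));
  res.send(
    '<!DOCTYPE html><html><body>' +
    '<div id="root">' + html + '</div>' +
    '<script src="/bundle.js"></script>' + // the client bundle then takes over on load
    '</body></html>'
  );
});

app.listen(3000);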

Why would you need it? So that the website becomes an application (exactly the same as on smartphones and other devices). If your product is not just a website but also runs on a bunch of other devices, this lets you simplify the server side: one server and one REST API for everything, instead of a separate website with its own server plus a separate infrastructure for the mobile apps.
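Roughly, the same JSON endpoints then serve the browser SPA, mobile apps and anything else. A sketch with Express (the /api/articles route and its data are invented for the example):

// One API for every client; no separate server-rendered website needed.
const express = require('express');
const app = express();

app.get('/api/articles', (req, res) => {
  // In a real project this would come from a database.
  res.json([{ id: 1, title: 'First post' }, { id: 2, title: 'Second post' }]);
});

app.listen(3000);

// The SPA and a mobile app consume it the same way, over plain HTTP, e.g.:
// fetch('/api/articles').then(r => r.json()).then(showList);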
July 9th 19 at 13:28
SPAs are often used in applications whose content does not need to be indexed, for example:
Gmail - it is implemented as an SPA; you'll agree that nobody is going to search for its content by keywords, right?
Admin panels - agreed that there is no need to index an admin panel?
A visual interface over a REST API - agreed that those pages don't need to be indexed?
Some private company resource - also not indexed.

As for the advantages of using it: an SPA on, say, Angular is built faster and more easily than rolling out big, bulky functionality on the server in PHP, Java or C#.
The second is working across different devices, and "devices" means not only mobile, tablets and desktops, but also engineering software modules (at least that's what the wiki says).
There is much less load on the server and more on the client, because everything loads at once and usually works asynchronously: only the content is updated, not entire pages.
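As a small illustration of that last point, a sketch of updating one container from a JSON endpoint instead of reloading the page (the /api/news URL and the #content / #refresh elements are made up for the example):

// Only the changed data travels over the network; one DOM node is re-rendered.
async function refreshNews() {
  const res = await fetch('/api/news');   // small JSON payload, not a whole HTML page
  const items = await res.json();
  document.querySelector('#content').innerHTML =
    items.map(n => '<li>' + n.title + '</li>').join('');
}

document.querySelector('#refresh').addEventListener('click', refreshNews);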
Making an SPA app in Angular is much more difficult than doing it with PHP on the server. Not to mention that you still can't get rid of the server: as a result, changes on the server lead to changes on the client and vice versa. - emanuel91 commented on July 9th 19 at 13:31
What is simpler and what is harder is very much an individual matter. If your team works in other stacks and you're the only one who knows JS/C#, then yes, that's how it is. If the situation is reversed and the team works with SPAs all the time, they will build it faster, to say nothing of your colleagues' skills. You can do it equally badly in Angular and in PHP. There are corporate practices and development habits. And I'm not comparing the languages, if that's what you think: they are used differently, the range of possibilities differs, and what fits depends on the project - that's the key. - heber_Prohas commented on July 9th 19 at 13:34
July 9th 19 at 13:30
It does get indexed. But you need to put in some effort.

How? For example, by googling "hash bang".
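For context, hashbang routing just derives the view from the part of the URL after #!. A simplified sketch (the routes and the #app container are illustrative):

// Hashbang router: a URL like /#!/about selects the "about" view on the client.
function route() {
  const path = location.hash.replace(/^#!/, '') || '/';
  const views = {
    '/': '<h1>Home</h1>',
    '/about': '<h1>About</h1>',
  };
  document.querySelector('#app').innerHTML = views[path] || '<h1>Not found</h1>';
}

window.addEventListener('hashchange', route);
window.addEventListener('load', route);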
If you're talking about hash fragments, that approach has been deprecated for a couple of years now. Use server-side prerendering. - emanuel91 commented on July 9th 19 at 13:33
:
PreRender is an internal implementation approach; search engines don't know what you do inside. - heber_Prohas commented on July 9th 19 at 13:36
: I'm just saying that things like serving a separate version of the pages via the query string (what you suggested googling) are a deprecated feature of the search engines and are not recommended.

Proof: https://webmasters.googleblog.com/2015/10/deprecat... - Allene_Crona78 commented on July 9th 19 at 13:39
:
And that's exactly my point:

The hashbang approach has been implemented many times, there are plenty of examples, and it still works fine.
PreRender, which you recommend: search engines don't see what you're doing inside the site.

If you can say something concrete about how to make single-page sites visible to search engines, I'd be glad to read it too. - Weston98 commented on July 9th 19 at 13:42
: I gave you a link; read it. - Allene_Crona78 commented on July 9th 19 at 13:45
: I'm afraid an answer in the style of "go google it" only shows your own ignorance of the subject. - Weston98 commented on July 9th 19 at 13:48
:

Do not get smart

Q: My site currently follows your recommendation and supports _escaped_fragment_. Will my site stop getting indexed now that you've deprecated your recommendation? A: No, the site will still be indexed. In general, however, we recommend you implement industry best practices when you're making the next update for your site. Instead of the _escaped_fragment_ URLs, we'll generally crawl, render and index the #! URLs.


Google has simply extended the hashbang concept to all direct links. - Allene_Crona78 commented on July 9th 19 at 13:51
Right in your quote it says "it is better not to use _escaped_fragment_". And hashbang was needed solely as a fallback for the History API to make this work well, and that fallback hasn't been needed for quite some time.

So once again: use the History API, not hashbang. - Weston98 commented on July 9th 19 at 13:54
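For comparison, a minimal History API sketch (the data-internal attribute, the routes and the #app container are invented for the example):

// History API routing: real URLs, no # fragment, so the path also reaches the server.
const views = {
  '/': '<h1>Home</h1>',
  '/about': '<h1>About</h1>',
};

function render(path) {
  document.querySelector('#app').innerHTML = views[path] || '<h1>Not found</h1>';
}

// Intercept in-app link clicks and push a real URL without a full page load.
document.addEventListener('click', (e) => {
  const link = e.target.closest('a[data-internal]');
  if (!link) return;
  e.preventDefault();
  history.pushState(null, '', link.getAttribute('href'));
  render(location.pathname);
});

// Back/forward buttons fire popstate.
window.addEventListener('popstate', () => render(location.pathname));
render(location.pathname);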
Still, let me spell out why I am completely against hashbang.

Pros of hashbang:

- it's easy to set up routing on both the server and the client, each one separately; ideal for sites where "routing" is only needed for primitive things.
- it works in every browser that has onhashchange.

Cons of hashbang:

- It scales badly: you have to invent hash-based workarounds as an analogue of the query string, for example if you have a filtered catalogue or a more complicated routing scheme.
- You cannot do server-side prerendering, because the fragment (everything after #) never reaches the server in the HTTP request. And prerendering is needed not only for search engines, but also so that the user doesn't have to wait an extra second while the app boots, works out the routing and fires its requests to the server... The need for server-side prerendering will only grow as SPAs spread.

That is, hashbang simply doesn't fit SPAs. - Weston98 commented on July 9th 19 at 13:57
July 9th 19 at 13:32
First, if it's a web application (say, an online video editor), there is no reason to index anything except the landing page at the root.

Second, what makes you think that modern search bots only work with HTTP and HTML and cannot execute JS?
Especially Google's bot: Google develops what is arguably the best browser in the world and the V8 engine, doesn't that tell you anything?
Run an experiment: make a simple page whose content loads via AJAX, submit the link to the search engines, look at the result and compare it with a version without JS.
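Such a test page can be as small as one script that injects all of its visible text after load (the endpoint below is a placeholder):

// The page itself contains no static text; if the bot executes JS,
// this content should still end up in the index.
window.addEventListener('load', async () => {
  const res = await fetch('/api/test-content');   // placeholder endpoint
  document.body.textContent = await res.text();
});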

Thirdly, if the browser does not support JS, you can serve the client a version without JS.
July 9th 19 at 13:34
Generate the static content and the sitemap.xml file automatically on the server side.
That's it.
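A hypothetical Node sketch of that generation step (the URL list is hard-coded here; in practice it would come from the database or router config):

// Writes sitemap.xml from whatever list of public URLs the site knows about.
const fs = require('fs');

function buildSitemap(urls) {
  const entries = urls.map(u => '  <url><loc>' + u + '</loc></url>').join('\n');
  return '<?xml version="1.0" encoding="UTF-8"?>\n' +
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n' +
         entries + '\n</urlset>\n';
}

const urls = ['https://example.com/', 'https://example.com/about'];
fs.writeFileSync('sitemap.xml', buildSitemap(urls));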
