I haven’t been keeping up with SEO, but I remember working with Google years ago on making Flash more SEO friendly. This included making SWFs dynamically “searchable” by Google’s indexing spiders. They would literally open and play the SWF and navigate through its buttons to identify the content within it. I’m assuming they do something similar with SPAs today, but I’m not sure.
At any rate, it’s not uncommon to render content through React on the backend and send that to the browser on the initial request, which helps with SEO and perceived performance. Client-side React then “hydrates” that content, attaching to the markup that’s already there rather than rendering it from scratch. If this is done for every route, spiders can read all the links in your app as independent pages with all the expected content.
… and now I’ll do the extra step of googling, and… yeah, it seems result #1 confirms what I said first, that crawlers can pretty much handle what you throw at them: https://medium.freecodecamp.org/seo-vs-react-is-it-neccessary-to-render-react-pages-in-the-backend-74ce5015c0c9