This is awful because all the cool innovations on the web technology front are dependent on dynamically rendered content. Just take a look at the stiff competition that’s heated up between React, AngularJS, Ember.js, and the dozens of other client-side web frameworks.
Eh, maybe. Matt Cutts promised as far back as three years ago that Google can crawl dynamic websites. There are dozens of Stack Overflow questions about this, and they all seem to be saying varying degrees of “shit if I know.”
In practice, I’ve yet to see Google successfully index our dynamically driven content. Our code isn’t exactly complex either.
There are various ways of solving this, but they’re all reminiscent of being thrown in a trash bag with an angry raccoon.
A common solution is to maintain a parallel set of pages that aren’t dynamically rendered. Redundancy ahoy. Another is to use a framework like Rendr that’s designed to execute on both the server and the client, so the content can be rendered server-side when a search engine crawler comes knocking. Existing projects can’t leverage this without a rewrite. Further, none of the frameworks that support this are anywhere near as popular as, say, Ruby on Rails, so there’s some implied risk.
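The first workaround above — parallel, pre-rendered pages — usually boils down to checking whether the request came from a crawler and handing it a static HTML snapshot, while regular browsers get the usual JS app shell. Here’s a minimal sketch in Node; everything in it (`isCrawler`, `pickResponse`, the snapshot strings, the user-agent list) is an illustrative assumption, not our actual code:

```javascript
// Illustrative sketch of crawler detection for serving pre-rendered snapshots.
// The patterns and function names are assumptions for this example only.
const CRAWLER_PATTERNS = [/Googlebot/i, /bingbot/i, /Baiduspider/i];

function isCrawler(userAgent) {
  // Treat a missing User-Agent header as a regular browser.
  return CRAWLER_PATTERNS.some((re) => re.test(userAgent || ''));
}

// What a build step (or headless browser) would pre-render for crawlers.
const SNAPSHOT =
  '<html><body><h1>Actual content, visible to crawlers</h1></body></html>';

// What normal browsers get: an empty shell the client-side framework fills in.
const APP_SHELL =
  '<html><body><div id="app"></div><script src="/app.js"></script></body></html>';

function pickResponse(userAgent) {
  return isCrawler(userAgent) ? SNAPSHOT : APP_SHELL;
}

// In a real app you'd call pickResponse(req.headers['user-agent'])
// inside your HTTP handler and send the result back.
```

Note the catch: you now have two renderings of every page to keep in sync, and if the snapshot drifts from what users see, you’re flirting with what Google considers cloaking.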
When I said it was magic, I meant the scary witches-brewing-green-shit way, not the unicorns-and-fairy-dust thing. Like seriously, check it out. But it works!
Yusuf is a chef, avid ukulele player, and hip-hop artist. Unfortunately he does all of those poorly, so he sticks to his day job writing software. Prior to The Muse, Yusuf worked as a developer at companies both big (Microsoft, IBM) and small (dotCloud, Transloc), and as an Associate Product Manager at Google. Find him on GitHub, Hacker News, or say hi on Twitter.