I think we are talking about websites that can be hosted on a static file server like Amazon S3/CloudFront, but which also have all the dynamic features of React available for building a rich and complex UX. For instance, when working with Phenomic (which uses Webpack), it's very easy to add any necessary React components to your project simply with npm install / yarn add. So it's really a marriage of Markdown-based content publishing and interactive client-side UX.
I eternally have this misunderstanding. I think static means unchanging.
In this context, it means static HTTP responses, but dynamic HTML/DOM.
I want a word to mean really static.
I'm in the core Phenomic team.
The idea in Phenomic is that we generate for each page:
- the HTML entry point, rendered like it would be with ReactDOMServer
- the data-requirements of the page
That lets us offer the following:
- Any page is accessible directly, without runtime (which has advantages regarding performance and SEO)
- Navigation works even if JS is disabled
Phenomic really is static, it just uses a few techniques that we learnt from the modern front-end development workflows and capabilities :)
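The two build artifacts described above can be sketched in a few lines. This is an illustration only, not Phenomic's actual internals — the names (`renderToHtml`, `artifacts`) and the page shape are assumptions:

```javascript
// Sketch: each page yields two artifacts, a prerendered HTML entry point
// (works with JS disabled) and a JSON file describing the page's data,
// which the client-side runtime fetches on navigation.
const renderToHtml = (page) =>
  `<!doctype html><html><body><div id="root"><h1>${page.title}</h1>${page.body}</div></body></html>`;

const pages = [
  { url: "index", title: "Home", body: "<p>Welcome</p>" },
  { url: "about", title: "About", body: "<p>Hi</p>" },
];

const artifacts = {};
for (const page of pages) {
  artifacts[`${page.url}.html`] = renderToHtml(page);
  artifacts[`${page.url}.json`] = JSON.stringify({
    title: page.title,
    body: page.body,
  });
}
```

In the real project, `renderToHtml` would be `ReactDOMServer` rendering the page component; the point is only that every route ships both forms at build time.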
Neat. Thanks for the info!
In this context some HTTP responses are static. The static part of the website.
But the React app can also start consuming all kinds of webservice.
This way (1) a large part of content can be served as prerendered static HTTP responses (HTML, CSS, JS). (2) Static content not initially loaded can be served as JSON (as seen in the presentation). And (3) dynamic content (e.g. real time comments or a chat) can be non-static (one or more webservices, possibly on WebSocket thereby going beyond HTTP).
I think this may be very interesting when "big mostly static content", "need for modern web tech" and "big traffic" meet.
Yes, the word static is a bit overloaded here.
Misused, rather. This is dynamic pages, client side.
I think the whole benefit of a static website generator in general is to avoid the "I copy/paste the same header/footer on each" step, especially copy-pasting the menu/header, as your menu can change as you add new pages / blog posts.
I can see good benefits of having a way to do static site generation with the React ecosystem if you're already familiar with it.
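The "no copy/pasted header/footer" benefit fits in a tiny sketch — a toy generator under assumed names, not any particular tool's API:

```javascript
// One shared layout, many pages: the header/footer and the nav live in
// exactly one place, and the nav updates itself when pages are added.
const pages = { "index.html": "<h1>Home</h1>", "about.html": "<h1>About</h1>" };

const nav = Object.keys(pages)
  .map((p) => `<a href="${p}">${p.replace(".html", "")}</a>`)
  .join(" | ");

const layout = (body) =>
  `<header>${nav}</header>\n${body}\n<footer>&copy; example.com</footer>`;

const site = Object.fromEntries(
  Object.entries(pages).map(([name, body]) => [name, layout(body)])
);
```

Adding a third page to `pages` updates every page's menu on the next build, which is exactly the step the copy/paste workflow gets wrong.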
> I think the whole benefit of a static website generator in general is to avoid the "I copy/paste the same header/footer on each" step, especially copy-pasting the menu/header, as your menu can change as you add new pages / blog posts.
If the website is simple, you could use iframes for headers and footers.
Except that iframes introduce additional http requests, difficulties with selecting text and all the other annoyances that come with them.
I'm surprised html doesn't have an include tag.
<!--#include virtual="../snippet.html" -->
Whoa. Flashbacks to SHTML.
I'm annoyed but not surprised. This is how a lot of HTML was developed.
First, you have a need. Then you get a crippled native "solution" that emerges from some committee and therefore doesn't really solve anything. In lieu of alternatives, a lot of people use it anyway. Then the problem is ignored long enough that other people come up with hacks that solve the same problem slightly less badly. Then the problem is ignored semi-permanently because 'hey, you can just use one of those hacks!'. (Look up the iframe seamless attribute.)
> I just make my bunch of html files (and I copy/paste the same header/footer on each), my css file, eventually 2 lines of JS directly in a <script> tag to toggle a menu on mobile devices
The workflow and output remains largely unchanged, but something like Jekyll would probably make the resulting codebase a little cleaner.
Otherwise, I completely agree. There are better tools for building static sites.
> I copy/paste the same header/footer on each
You could at least use server side includes (SSI) to include those snippets.
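For anyone who hasn't set SSI up before, this is roughly what it takes on Apache httpd — a hedged sketch that assumes mod_include is loaded; the paths are examples:

```apacheconf
# In httpd.conf or .htaccess: enable server-side includes so that
# <!--#include virtual="/includes/header.html" --> directives in
# .shtml files are expanded on every request.
Options +Includes
AddOutputFilter INCLUDES .shtml
```

nginx has an equivalent `ssi on;` directive. Either way, the header/footer lives in one file and every page pulls it in at serve time.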
Static website generators like Jekyll are pretty cool, though. You should definitely check them out.
I think you're missing the point. You can use all the UI treatments and components, using AJAX if you want to. It basically gives you a lot of options and you're not stuck using jQuery plugins ad hoc when you want to make a customer form or list your latest products with search functionality on a site that is mostly static.
I think the web got complicated when somebody thought about users creating content that other users could see. It could have been done with a mailto tag instead of creating PHP!
And then install mail client plugins that transform all that information and solve mental poker for N participants! Yes!
Yes, I always hear people raving about how awesome their email client's UX is /s
That'd be me; my email-client has Conway's game of life built in ;)
Found the emacs user! ;-)
Close, but no cigar! I wrote my own mail client, inspired by Mutt but extended with a real programming language (lua):
I added a game of life mostly to prove the extensibility covered the screen modes.
Same, but I'm exceptionally lazy so I use SSI to include the footer/header to avoid having to copy/paste code
> I just make my bunch of html files (and I copy/paste the same header/footer on each), my css file, eventually 2 lines of JS directly in a <script> tag to toggle a menu on mobile devices, I upload it on S3/CloudFront and that's it.
You copy/paste code? This is crazy.
There are often situations where the long-term benefits of automating something aren't worth the up-front cost.
Throwing together a small static website with a few pages, in my experience, is one of those situations.
But the up front cost of using something like Jekyll is about fifteen minutes of work. Plus you can find a lot of free templates online that make your basic site not look awful.
I know I've spent hours trying to get Ruby set up and figuring out why bundle install failed, so I disagree with your "fifteen minutes" comment. Maybe in the best-case scenario.
> "The one that allows you to improve the quality of your code with unit testing and type checking"
Wait, are we really talking about a static website?
I'm a big fan of React and all the JS ecosystem, but when I need a static website, I just make my bunch of html files (and I copy/paste the same header/footer on each), my css file, eventually 2 lines of JS directly in a <script> tag to toggle a menu on mobile devices, I upload it on S3/CloudFront and that's it.
I just started building a blog with Gatsby; after a lot of research it was exactly what I was looking for. Since you're the author of Gatsby, what would you say the major differences are between Phenomic and Gatsby?
I also use Gatsby for a blog, thank you for the software, exactly what I was looking for.
I'm the author of another React static site generator (https://github.com/gatsbyjs/gatsby).
I totally agree that for many sites, using Phenomic or Gatsby would be overkill.
But for something small, it doesn't really matter what tool you use as long as it's familiar. For small projects, familiarity trumps any other concern.
React came out of Facebook, which needed a frontend technology that could scale to thousands of developers.
Gatsby and Phenomic are both an attempt to port the best of the React ecosystem to the world of building web sites. They're designed so that as your website gets larger and you add more people, the code still feels simple and it's still easy to make changes and add new features.
So yes there's a learning curve and some overhead but it really pays off for larger projects.
And the nice thing is that once you understand them, they're now familiar so just as easy to use on small projects as any other solution.
At my company, we had difficulty making our landing pages rank when they were just written in React. We ran an experiment where we took out the interactive parts of it and just put the static content into a html page-- and it performed better.
The other engineers and I were flabbergasted ("it's not supposed to make a difference!"), but our SEO expert was not surprised that there were inconsistencies in the Googlebot documentation. It goes to show that we still don't have a lot of transparency into Google's algorithms.
I suspect that, like you say, it was a speed-to-paint thing. But when you're in a crowded keyword space it makes a difference. (As opposed to when you're just trying to rank for your own unique name --- looking at you, Preact ;) )
Search is more than Google. There are many other search engines, and I'm not talking just about Bing. I'm talking about the likes of Yandex and Baidu, with huge user bases. Do you know whether they can crawl SPAs? What about Archive.org? What about new projects that are just trying to get started and need to crawl the web to do their thing? Saying "I don't care about those" over and over again is not a good enough answer.
> To be clear, this is a really cool project but I fear that many of us throw out keyword spam like SEO and UX to make something more digestible, when the reality is that the performance gain is the real hero!
Yes but the performance gain can have an impact on SEO. I'll have to find the article but I believe the StackOverflow guys figured this out early on and blogged about it.
Do you have proof that JS rendering is equally as friendly to SEO as statically rendered html?
I have been fighting this battle for a while at different places, but I can't definitively prove it's true, and most SEO experts I work with seem afraid to upset the apple cart, explaining that Google might be good at it, but what about Yandex or Baidu? I know there are small tests, but has a large corporation where SEO really matters made the switch without a meaningful SEO hit?
On the contrary, all I've ever heard of are folks who start with JS rendering and then add server-side rendering (e.g. with React Server) and report how much better their SEO traffic gets.
That's a bummer, but your SEO experts shouldn't be afraid! They shouldn't let you flip over the whole website either though. You guys should just run a test on one landing page. Yandex and Baidu are pretty ignorable, Google is King.
I remember reading about experiments that indicated that Google can do JS rendering, but the majority of crawler runs don't do it. It's more expensive and thus only done occasionally where necessary. Of course that could be outdated, or an artifact of their measuring methods, or ...
> The result is a website completely SEO friendly that will work for the two people in the whole world who disabled JS
What will it take to dispel the myth that JS rendering isn’t SEO friendly? There seems so much confusion on this point but the fact is that as long as your UI is rendered synchronously (i.e., any data needed for the UI doesn’t have to get fetched from a server), the page will be crawlable by Google.
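The sync/async distinction can be made concrete with a toy model. This is an illustration of the claim, not Googlebot's actual policy — the crawler here simply doesn't wait for outstanding async work:

```javascript
// A crawler that executes the page's synchronous JS but does not wait
// for later network requests sees only the synchronously rendered markup.
function crawl(page) {
  const html = page.renderSync(); // available immediately: gets indexed
  page.fetchExtras();             // resolves later: possibly never seen
  return html;
}

const page = {
  renderSync: () => "<h1>Pricing</h1><p>Plans start at $5/mo</p>",
  fetchExtras: () => Promise.resolve("<section>live chat widget</section>"),
};

const indexed = crawl(page);
```

If your core content lives in `renderSync`'s output, the crawler has it; if it arrives via `fetchExtras`, you're betting on the crawler executing and waiting for your JS.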
Further, the goal of isomorphic rendering/rendering static sites in general is not to serve individuals who have JS disabled (that’s a bonus, sure), but rather to remove JS loading and execution from the equation with the end goal of faster paint times for your application.
To be clear, this is a really cool project but I fear that many of us throw out keyword spam like SEO and UX to make something more digestible, when the reality is that the performance gain is the real hero!
The article is about phenomic, a static site building tool for React.
I take back everything I've ever said about web development. Those horrible days are behind us. Thank you to the react, typescript, and phenomic teams. I can easily base my next 5-10 years of web development work off the frameworks and directions you've set out for us.
Browsing static pages is near-instant. No need for React to do that.
Sure, there are diminishing returns going from the ~1-2 second page change of a (slower) DB-backed site, to ~500 milliseconds for a normal static site generator, to ~50 milliseconds for a React-based one, but it's still a significant difference.
But in any case, client-side routing to me is a nice-to-have not the killer feature for Gatsby. Building web sites with the React component model is the killer feature for me.
Another point in favor of this model is because routing is client-side, page changes are consistently fast. In best-case network conditions, normal static sites are very fast but throw the user on a glitchy 3g network and you'll soon see 5-10 second page loads. The same site built with Gatsby will still see instant page changes as page data is precached.
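The precaching mechanism is essentially a memoized loader. This is a sketch under assumed names (`prefetch`, `navigate`), not Gatsby's actual API:

```javascript
// Fetch a page's data while the user is still reading the current page,
// so a later navigation resolves from memory even on a glitchy network.
const cache = new Map();

async function prefetch(path, loader) {
  if (!cache.has(path)) cache.set(path, await loader(path));
}

async function navigate(path, loader) {
  if (cache.has(path)) return cache.get(path); // instant, no network round-trip
  return loader(path);                         // cold path: hit the network
}
```

In practice the prefetch is triggered when a link enters the viewport or on hover, which is why the click itself feels instant regardless of latency.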
That's a big advantage, but at the same time you're pushing content to a user that they might not want, or ever see, which could be a problem for users on poor quality 3G networks. If bandwidth is at a premium a developer shouldn't waste it.
"Bandwidth" isn't the problem on 3g generally. What does cause UX problems is high latency. Speculatively precaching content is basically the only solution.
FWIW, while I'm in the US so don't really understand developing for very poor networks, the methods I'm describing is exactly what companies in India and other places with poor networks are adopting: https://developers.google.com/web/showcase/2016/flipkart
This is also a good read https://developers.google.com/web/fundamentals/performance/p...
This claims to improve static websites by adding hot reloading, search, React, the NPM ecosystem, and faster page load.
Search can already be done using Jekyll https://blog.algolia.com/instant-search-blog-documentation-j... (same idea of indexing at build time).
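The build-time indexing idea is simple enough to sketch without a hosted service — a toy inverted index, not Algolia's or Jekyll's actual implementation:

```javascript
// Tokenize every page once at build time and ship the resulting index
// with the site, so search needs no server at all.
const sitePages = {
  "/": "Phenomic builds static sites with React",
  "/blog": "Jekyll can hot reload with jekyll serve",
};

const index = {};
for (const [url, text] of Object.entries(sitePages)) {
  for (const word of text.toLowerCase().match(/[a-z]+/g)) {
    (index[word] = index[word] || new Set()).add(url);
  }
}

const search = (term) => [...(index[term.toLowerCase()] || [])];
```

A real setup would add stemming and ranking, but the core trick — indexing at build time — is the same whether Jekyll or Phenomic is doing the building.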
Jekyll can hot reload with jekyll serve.
Anyone who is familiar with React is familiar with HTML and CSS, whereas the opposite is not necessarily true. This means if you are able to use Phenomic, you are able to use Jekyll, but not vice versa.
The NPM ecosystem is useful for authoring templates, not for the end user. If I were writing a Jekyll template I'd probably use webpack and npm, but I don't need that baked into the static site generator itself.
Finally, page load is a really dubious claim; here are some numbers:
On phenomic.io, the HTML for the index page weighs 3K gzipped (measured by copying the HTML, removing what can be externalized and cached like styles + scripts, and gzipping index.html)
When using Phenomic to do client-side loading, the JSON for the different pages is from 600B to 2.9K (as seen in network tools when clicking on the links in the top nav).
A page load with Phenomic therefore saves you:
* 2KB of bandwidth for some pages
* a few 304 Not Modified requests for static resources, which are negligible if you use HTTP/2
On the flip side of the coin, phenomic.js, the script bundle which makes all this magic possible, weighs 132KB, i.e. 44 times the size of the content the user wants to view.
I can't see the value here.
I've used nuxt.js for VueJS and I'm fairly happy with it. They're still pre-v1 release so things are in flux, but it does a lot of heavy lifting and maintains a nice file and folder structure.
Looks pretty slick.
Is it just me, or does it look like Next.js uses pure functions?
It's just you.
Or rather, next.js uses React Components. You can write those as pure functions or you can write them as classes to have more fancy features (e.g. setState, lifecycle hooks)
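The two component styles can be shown side by side. This is an illustration only — plain objects stand in for React elements so there's no JSX and no React import; `h` is a made-up stand-in for `createElement`:

```javascript
// Both styles produce the same element shape; the class just adds
// room for state and lifecycle hooks on top of the same render contract.
const h = (type, props, ...children) => ({ type, props: props || {}, children });

// Pure-function component: props in, element out.
const Greeting = (props) => h("h1", null, `Hello, ${props.name}`);

// Class component: same output, plus instance state.
class Counter {
  constructor(props) {
    this.props = props;
    this.state = { count: 0 };
  }
  render() {
    return h("button", null, `Clicked ${this.state.count} times`);
  }
}
```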
Came here to see if anyone mentioned next.js: https://zeit.co/blog/next
Which fits in with this discussion.
Been working well for us on https://DNSFilter.com
Because if you already use the sledgehammer to build mansions for your day job, using it to build your shed will take no time at all.
That's why I screw all my screws in with a hammer. I'm just so good at nailing things!
The arguments seem pretty clearly presented in the article. What part do you disagree with?
Static sites sit on a large spectrum from very simple to very complex. The nice thing about React is it works very well on very simple sites as well as very complex sites. So yes, for a very simple 5-10 pages site, using React might seem silly but for a site with complex theming and lots of clientside JS, using React makes things a ton easier.
React is a good paradigm to build UI. That's why people like it and use it more and more. If people both like React and need to create lightweight websites, why won't you let them do it?
Why does this feel like using a sledgehammer to hang a painting? I know people like React a lot, but it has no place generating a static website.
How are current static website generators limited? The presentation doesn't go into why that is.
The biggest win with building a SPA in react or angular is that you can host it on S3 and front it with Cloudfront and never ever worry about it dying. Plus, performance is amazing.
I hugely agree. I've done a lot of web development in a range of circumstances and this current period is frustrating. I just started a new project at a new company; decided to use React; bought and downloaded a React Admin template; spent 2 hours trying to get it to "build"; gave up after I couldn't sort the node_modules conflicts among about 50 modules (totalling nearly 1GB!), each of which had 10+ modules as a dependency.
Angular did have its issues, but I dropped a reference to CDN's version into a web page and was off to the races very quickly.
I'm literally using Makefiles to build, test and deploy a set of about 7 microservices (including their web admin tooling). It's incredibly straightforward, flexible and clearly documented... Gulp, Grunt, WebPack... What are we doing?
>Gulp, Grunt, WebPack... What are we doing
Once you've accepted that JS is here to stay in browser-land (for now, at least), what alternatives are there for build tools?
As a younger dev, when I look at makefiles, I see the same level of complexity and "confusing-ness" that I encountered when I first learned about gulp or webpack configs. I keep reading posts like this on HN where people yearn for simpler days, but weren't the demands of websites and user interfaces simpler back then as well?
I've only been doing professional web development specifically for 4 years or so, so I don't have the same experiences of building in older technologies, but I simply can't imagine trying to build a large, immersive SPA without a modern framework.
How do you manage global state or services? Just throw everything in the global namespace? How do you minify your js/css/html? How do you treeshake your unused code? How do you create a local proxied server to avoid CORS issues while developing? How do you autoprefix your css automatically? How do you develop with live-reload functionality?
I know all of these can be done individually without these tools, but you can do these all individually without gulp/webpack/etc just by using node/npm scripts. These tools just package them into a more convenient format for common use cases, and tie complimenting tools together. It seems to me that everything would need to be custom built, and you would need to recreate complex scripts from scratch for every new project.
These tools obviously aren't perfect, but I'm not sure they're as bad as you're making them out to be either. Like I said, I don't have the same long-term experience as you or others might, so forgive me if I'm just being naive here.
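As the comment above notes, several of those tasks can be wired up with bare npm scripts. A hedged sketch — this assumes the third-party CLIs `esbuild`, `postcss-cli` (with `autoprefixer`), and `live-server` are installed as devDependencies, and the paths are examples:

```json
{
  "scripts": {
    "build:js": "esbuild src/app.js --bundle --minify --outfile=dist/app.js",
    "build:css": "postcss src/style.css --use autoprefixer --output dist/style.css",
    "dev": "live-server dist"
  }
}
```

This covers bundling, minification, autoprefixing, and live reload without a gulpfile or webpack config; what webpack/gulp add on top is mostly orchestration of many such steps.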
> bought and downloaded a React Admin template
> Angular did have its issues, but I dropped a reference to CDN's version into a web page and was off to the races very quickly.
Then you're not really comparing apples to apples here. There are <script> loadable versions of react on CDNs that you could have used.
Probably the difference between having Angular being a full MVC framework, while React is simply a View library you combine with other libraries to get what you need.
Combining various libraries can be a pain and sorting out all of the dependencies and node modules like he stated can be cumbersome.
> anyone who wants to build websites of even the simplest kind is expected to wheel in an incredibly large client-side and development stack.
What? Says who? In my humble opinion this sounds to me like a case of "I don't like when people use things that I don't like". Nobody is expecting anyone to do anything for "websites of even the simplest kind". People just use technology they like or think is fun. Nobody is forcing you to like it or use it.
I agree there's an element of "get off my lawn" about me - but it's backed by a genuine fear about what happens to people new to web development.
The impression given by looking around the web for discussions and tutorials is that you have to learn full client-side MVC. This is a huge burden to impose.
The other danger is that people are building sites using these stacks because they never learnt the simpler way to do things. All the complexity gets pulled in without question.
At least us old-timers have the experience to know when there's a positive cost/benefit ratio for all this tech. I'm not sure everyone will in the future.
I get what you're saying, and I do agree that beginners can easily get bogged down in all the different tools, frameworks and libraries, however at the end of the day I feel like beginners just have to learn how to navigate the ecosystem. If a novice is trying to build a webpack powered react/redux SPA on top of kafka without understanding basic programming fundamentals then only their own failure is going to teach them why that isn't such a great idea. At a minimum I think we need to just expect beginners to do a little research on what it is they need to learn. Overambitious novices are not exclusive to the software world, think of a novice guitar player who insists on spending $3000 on pedals, amps and mixing software before they're able to pluck out "twinkle twinkle little star"; some people just need to learn the hard way.
Besides, as you alluded to, having the ability to weigh tradeoffs and make a cost/benefit analysis is a honed skill that takes time and experience to develop and making a few incorrect choices is a critical part of the learning process.
Nobody is forcing you to like it or use it.
This is an ironic reply in the context of an article showcasing a technology that literally obviates that exact issue.
I think these tools (Phenomic, Gatsby) are aimed at front-end developers who are already familiar with React resources and who can leverage that knowledge and the ecosystem of tools to build static sites much the way they build apps.
The bells & whistles are mostly added bonuses - not an argument that all static sites should be built this way. At least that would not be an effective argument in my book.
> We've now reached the point where HTML and CSS served via a normal webserver is regarded as insufficient
We've reached this point in 1995, that's why Rasmus Lerdorf created PHP !
Edit: Oh, I just realised that this is even older than the CSS spec itself (1996). So technically, there has been no point in time where HTML and CSS served via a normal webserver were regarded as sufficient :)
What is a «normal webserver» anyway?
You know what I meant. If you're entirely serious then I'll engage but I think you're splitting hairs somewhat. ;-)
On the contrary, you shouldn't feel sad but rather excited. So many people have a false expectation that they must build everything as a SPA, even with all of the glaring UX problems that the SPA model presents.
You should be excited because if you're building a CRUD application that doesn't need to be a SPA, and you choose the traditional server-rendered route... your app is going to be better than the equivalent SPA hog. Your app is going to be better, users will feel that it's better, and they'll give you their money.
Oh, and you don't have to re-invent the wheel with things like routing, so you are saving a lot of development time that the SPA competitor wastes. Another win for you and your customers: more time to work on features.
But if you could anticipate even some of these requests, and you request ahead of time and deliver the results via DOM mutation, then you might be decreasing net user interface latency.
Incredibly large? Chrome and httpd are each hundreds of times larger than react. If you like simplicity then you should abhor HTML, not React...
Impatient-mode in Emacs can give you automatic page reload as you type.
> anyone who wants to build websites of even the simplest kind is expected to wheel in an incredibly large client-side and development stack
not really, you just seem to be overreacting to someone experimenting with a different way of doing it. I doubt most users would be able to tell the difference when browsing a traditional static site vs. the one in this article. I agree with you that there's a lot of overkill in our industry, but engineers can't be expected to sit on their hands and think up new ideas :)
It's funny how page reloads were fine 20 years ago, and now, with much faster computers and bandwidth, we try to avoid them.
The advantage, though, is that you do not need a server. Even with dynamic state/data, the app can be hosted on a CDN or static web server.
Sometimes it's just fun building things in a nice technology.
I don't have time to formulate a proper response but I just wanted to say this makes me sad.
We've now reached the point where HTML and CSS served via a normal webserver is regarded as insufficient and anyone who wants to build websites of even the simplest kind is expected to wheel in an incredibly large client-side and development stack.
Something has gone wrong. All this to avoid a page reload? Should the terrible effects of a page reload not be solved elsewhere?
> without the hacky "pjax" solution.
Before you start using the word 'hacky' you might want a moment of self-reflection.
This is a sensible assumption for something like a blog, but falls apart quickly on most other sites I've had to build for clients (e.g. more than 1 column, homepages with lots of "modules", etc).
Kind of ironic that React's "big idea" is that it's components all the way down, but when it comes time to structure the content, it's like "nope, just one big monolith of styled text".
(YAML front-matter helps a little bit, but usually is only intended for "metadata", not the primary page content itself.)
Good project, but there is some confusion in what they mean by a static website. As I understand it, static does not imply no JS (or no changing the page based on user input), but that the server is not rendering the page specifically for each request; it just serves a page that is ready when the request hits the server.
Just blogged about static site generation with Webpack. This is a React-independent approach; you can basically use any renderer that runs on Node.js.
This is brilliant and a niche needed for React. My toolset is React Native, React web apps and now this for simple marketing sites that you can inject functionality into.
Thanks for your work! This is cool stuff for my project; I've been looking for a solution like this for a very long time.
I've worked with React since the first versions. I remember trying to get the first buggy Webpack version started and losing two weeks on it back in 2014...
The most important thing for me is that the content is kept apart from the technology, in markdown files that are easy to write and easy to read.
The fact that Phenomic generates HTML makes it almost free to deploy, without any security risk.
Why would you do this?
- to share styling & code between an app and a related static site
- because it's really easy, if you already have your full front-end dev stack set up.
Hot reloading is the only feature I could use compared to my self-made script.
That should be done by the browser, actually. The Epiphany browser does it, but no others do.
For search there is Google.
Statically served isn't the opposite of dynamically updated, dynamically fetched, and extensible with dynamic parts on the client.
> What if this website will be static but dynamic at the same time?
Well then it wouldn't be static would it?
Doing exactly this right now on a new project using Preact. Working great so far.
This seems like a tough sell. What's wrong with index.html with inline CSS, or CDN jQuery you can sprinkle in to add a dynamic feel? That comes in at less than 5KB.
What is the point of over-engineering a static website that can be solved with far, far less?
No, crazy is suggesting Web Components are even remotely suitable for a static site or server-side rendering. Good luck with those if your users have disabled JS.
That is crazy. React is just the current trend but having tried both React and Polymer (Web Components), I'm confident that plain Web Components will win out in the end. You just can't beat simplicity.