I've implemented server-side rendering for all the pages, so you should no longer see loading skeletons. Hopefully this will make the site feel more performant. IMO it feels more natural, but it doesn't feel faster just yet. It took a fair bit of hacking to get SSR + client-side caching working, but it's a good first step toward better performance. With this in place we can later add incremental static generation (i.e. periodic pre-rendering of pages) to really speed things up.
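For reference, the incremental bit would look roughly like this - just a sketch assuming a Next.js-style setup; the page and data helper are made up, not the real code:

```tsx
// pages/recent.tsx -- hypothetical page and data module, for illustration only
import type { GetStaticProps } from 'next'
import { fetchRecentItems, Item } from '../lib/data' // placeholder data helper

export const getStaticProps: GetStaticProps<{ items: Item[] }> = async () => {
  const items = await fetchRecentItems()
  return {
    props: { items },
    // Re-build this page in the background at most once every 60 seconds,
    // so visitors always get a pre-rendered page without waiting on the DB.
    revalidate: 60,
  }
}

export default function Recent({ items }: { items: Item[] }) {
  return (
    <ul>
      {items.map((item) => (
        <li key={item.id}>{item.title}</li>
      ))}
    </ul>
  )
}
```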
I'm the best man in a wedding next weekend and this weekend is the bachelor party, so I'm probably not going to get a whole lot done over the next few days. My plan when I do get time is to implement an invite/referral system and begin hacking on growth more directly, then get back to other nice-to-haves like user profiles.
Overall the performance is pretty good :)
I think there are still a couple of things that could make the perceived performance better:
  • The header boxes don't have a predefined width (they could be flex: 1), so they keep jittering until the logo font and the BTC price load.
  • I see around 550ms of delay after clicking on "recent": 450ms fetching data and another 100ms rendering. The rendering could be faster given how simple the page is. There is also a bit of jitter in the page after load (I'm assuming it comes from the GraphQL response).
  • You could change the view as soon as you fire the data fetch request (currently it waits for the response); this would likely improve the perceived speed a lot (see the sketch after this list). Essentially the page needs to react within 100ms of me clicking - but the reaction doesn't need to have the full data. 100ms is a homo sapiens sapiens limitation, so no need to get better than that :D
  • You may consider pre-loading the data through a ServiceWorker? That said, that may be annoying later when you add better sorting options - so my actual suggestion would be to not use a ServiceWorker yet, but to consider it later for other features.
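To illustrate the "react before the response arrives" point: something like this lets the list render immediately from whatever is already cached while the fresh data streams in. Rough sketch only, assuming Apollo Client on the GraphQL side; the query and component are made up:

```tsx
// Hypothetical "recent" list; RECENT_ITEMS and RecentList are made up names.
import { gql, useQuery } from '@apollo/client'

const RECENT_ITEMS = gql`
  query RecentItems {
    items {
      id
      title
    }
  }
`

export function RecentList() {
  // cache-and-network returns whatever is already in the cache immediately
  // (so the page can react within ~100ms of the click) and then refreshes
  // it in the background once the network response arrives.
  const { data } = useQuery(RECENT_ITEMS, { fetchPolicy: 'cache-and-network' })

  if (!data) return <p>Loading…</p> // only hit on a true cold start

  return (
    <ul>
      {data.items.map((item: { id: string; title: string }) => (
        <li key={item.id}>{item.title}</li>
      ))}
    </ul>
  )
}
```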
Thanks for the analysis! How are you measuring performance?
> You could change the view as soon as you fire the data fetch request (currently it's waiting for the response), this would likely improve the perceived speed a lot.
This is how it worked before, but there were loading skeletons and someone complained about them. Maybe that was a red herring.
> Essentially the page needs to react within 100ms of me clicking
That'll be the target. I'll probably have to incrementally statically generate them.
Edit: I'm also occasionally getting a full page reload on navigation, which shouldn't happen (it should all be one page routed with React). So something is amiss.
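One thing I'll check is whether any internal link is a plain anchor instead of the router's Link, since that forces a full reload. Something like this (hypothetical nav item, assuming a Next.js-style router):

```tsx
// Hypothetical nav item -- just illustrating Link vs a plain anchor.
import Link from 'next/link'

export function RecentNav() {
  // A plain <a href="/recent"> forces a full page load; Link keeps the
  // navigation inside the client-side router.
  return <Link href="/recent">recent</Link>
}
```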
I just use the DevTools Performance tab and record interactions there. But the performance is already pretty good, so no need to go crazy here :)
To get under 100ms the UI can't make a request and wait for the response - it's not realistically possible to do that in under 100ms. So yeah, you will need some trick with either prefetching, caching older state, or something like that. For a simple page like this it's not clear to me that server-side rendering will have a notable benefit (I'd expect React to take ~20ms to render the DOM and then the browser to take another 50-100ms to do layout either way).
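For the prefetching trick, even something as small as warming things up on hover goes a long way. Rough sketch, assuming a Next.js-style Link and Apollo Client; the RECENT_ITEMS query module is made up:

```tsx
// Rough sketch of prefetch-on-hover; RECENT_ITEMS is a made-up query module.
import Link from 'next/link'
import { useApolloClient } from '@apollo/client'
import { RECENT_ITEMS } from './queries'

export function RecentLink() {
  const client = useApolloClient()

  // Start the GraphQL request when the user hovers the link, so the cache
  // is already warm by the time they actually click through to /recent.
  const warmUp = () => {
    client.query({ query: RECENT_ITEMS }).catch(() => {})
  }

  return (
    <span onMouseEnter={warmUp}>
      <Link href="/recent">recent</Link>
    </span>
  )
}
```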
Great job! It's always been fast for me. I'm sure the future Stackers you're building for appreciate the hard work.