I am wondering what the current best practices are on the topic of “Updating Supabase collections after insert/update/delete” in the context of real-time CRUD apps (multi-user, simultaneous/collaborative use, need for immediate visibility of changes, and avoidance of editing collisions).
My observations have shown that:

- although realtime is activated on both sides and the collections are loaded initially on page load, automatic updates of the collections after DB transactions do not seem to be guaranteed
- even activating “return updated rows (all fields)” does not seem to be sufficient in certain cases (at certain times? under server load? websocket issues?)
- the only thing that helps reasonably reliably is a time-delayed “fetch collection” request (delayed because of the after-insert triggers used in some cases) - see the sketch below.
Apart from the fact that it doesn’t seem economical to reload the entire collection every time, there is also the question of which solution has the least time lag (in the context of synchronous collaboration). And updating the collection with local changes alone is not enough, because the updates of all the other collaborators are needed as well.
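For concreteness, here is roughly what my delayed-refetch workaround looks like as a minimal sketch, assuming direct use of supabase-js with a placeholder `tasks` table and a hypothetical `applyCollection` setter (in WeWeb, the “Fetch collection” action plays that role):

```ts
import { createClient } from '@supabase/supabase-js'

// Placeholder credentials and table name, for illustration only.
const supabase = createClient('https://your-project.supabase.co', 'public-anon-key')

// Insert a row, then refetch the whole collection after a short delay so
// that after-insert triggers have finished before we read the final state.
async function insertAndRefetch(row: { title: string }) {
  const { error } = await supabase.from('tasks').insert(row)
  if (error) throw error

  // Crude but reasonably reliable: wait, then reload everything.
  await new Promise((resolve) => setTimeout(resolve, 500))

  const { data, error: fetchError } = await supabase.from('tasks').select('*')
  if (fetchError) throw fetchError
  applyCollection(data)
}

// Hypothetical setter for the local collection.
function applyCollection(rows: unknown[]) {
  console.log('collection refreshed:', rows.length, 'rows')
}
```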
Does anyone have the time and know-how to share their own experiences/best practices? Have I overlooked a central post or a WeWeb doc?
I’ll preface this by saying that Supabase’s realtime is not made for large-scale “realtime things” (imo).
Apart from that, you’re most probably facing latency. You should work with what realtime gives you back and, based on that, update the rows that are affected. This is complex to pull off, but doable. I don’t think it’s a Supabase issue in this case, though; I think you just need to put in the work, as it doesn’t come out of the box. Also, realtime doesn’t work with JOINed tables or VIEWs, so bear that in mind.
TLDR: You shouldn’t be fetching the whole thing each time something changes; you should react to the changes instead. I teach this at my consultations (Consult with Broberto – Broberto).
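To illustrate the pattern, a minimal sketch with supabase-js v2 that patches a local array from change events instead of refetching (the `tasks` table, `Task` type, and local array are placeholder assumptions; in WeWeb you’d update the collection variable from a workflow instead):

```ts
import {
  createClient,
  RealtimePostgresChangesPayload,
} from '@supabase/supabase-js'

// Placeholder table shape; adjust to the real columns.
type Task = { id: number; title: string }

const supabase = createClient('https://your-project.supabase.co', 'public-anon-key')

// Local copy of the collection, loaded once on page load.
let tasks: Task[] = []

// Apply one change event to the local collection instead of refetching.
function applyChange(payload: RealtimePostgresChangesPayload<Task>) {
  switch (payload.eventType) {
    case 'INSERT':
      tasks = [...tasks, payload.new]
      break
    case 'UPDATE':
      tasks = tasks.map((t) => (t.id === payload.new.id ? payload.new : t))
      break
    case 'DELETE':
      // payload.old carries only the primary key unless the table's
      // replica identity is set to FULL.
      tasks = tasks.filter((t) => t.id !== (payload.old as Task).id)
      break
  }
}

supabase
  .channel('tasks-changes')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'tasks' },
    applyChange
  )
  .subscribe()
```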
Thank you very much, Broberto, for your valuable advice. There will certainly come a time when I will also consider paid consulting. Thanks for the link! At the moment, my project is still in the feasibility phase. Perhaps you would still like to be a little more specific?
Can you specify which dimension you have in mind for “large-scale realtime things” (peak concurrent connections? table size? etc.) and why you think Supabase does not cover it? (Thanks for the clarification that realtime does not work with JOINed tables or VIEWs. I suspected as much and don’t use it that way.)
Latency is not my biggest problem (it’s not about chat, visual hints about who is working where, or the like). It just needs to reliably show the latest data state to all users - let’s say within a few seconds. So it’s not about broadcast or presence tracking, but about economical and reliable propagation of Postgres changes…
“You should be working with what realtime gives you back and upon that update the rows that are affected”: Do I understand you correctly that, in your opinion, realtime does return the update data, but WeWeb does not/cannot display it, and therefore partial updates have to be applied manually (if you don’t always want to fetch everything)?
If you turn on realtime on a table, it should be synced automatically, without needing to fetch it again or take any other action. If you work off a Subscription to a Database Change, you need to handle it manually.
This is quite a complex topic that I’d love to talk about at much greater length, but I only have so much time to dedicate to the community. If you need to talk through the feasibility of your specific case, I think a consultation would be beneficial; or hopefully someone else can chip in.
Thank you very much, Broberto; of course I understand that, and I am grateful for the tips you have given me so far. As I said, realtime is switched on on both sides and works automatically most of the time, without my having to fetch the collections manually. But not reliably at ALL times. I would just like to understand the specific conditions under which the automatic realtime functionality stops working in these situations. Then perhaps a solution would be to fetch the entire collection only in those situations.
If anyone else here has an explanation, I would be grateful; otherwise, thanks for now!
(I’ll probably have to take a closer look at the “On realtime database changes” workflow, which listens on the channel for database change events…)
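For reference, one plausible cause of the dropouts (an assumption on my part, not a confirmed diagnosis) is a dropped websocket connection: postgres_changes events that occur while the client is disconnected are not replayed afterwards. A reconciliation sketch along those lines, refetching once on every re-subscribe (the `refetchCollection` helper and `tasks` table are placeholders; in WeWeb the “Fetch collection” action would do the reload):

```ts
import { createClient } from '@supabase/supabase-js'

const supabase = createClient('https://your-project.supabase.co', 'public-anon-key')

// Hypothetical helper: reload the whole collection once.
async function refetchCollection() {
  const { data, error } = await supabase.from('tasks').select('*')
  if (error) throw error
  console.log('reconciled:', data.length, 'rows')
}

let hadFirstConnect = false

supabase
  .channel('tasks-sync')
  .on(
    'postgres_changes',
    { event: '*', schema: 'public', table: 'tasks' },
    (payload) => {
      // ...apply the single change to the local collection, as in the
      // earlier sketch...
      console.log('change event:', payload.eventType)
    }
  )
  .subscribe((status: string) => {
    if (status === 'SUBSCRIBED') {
      // Events emitted while the websocket was down are lost, so refetch
      // once on every reconnect to reconcile local state with the DB.
      if (hadFirstConnect) refetchCollection()
      hadFirstConnect = true
    }
  })
```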