For multi-user access you do have to disconnect from the database to free its resources. That's actually the proper way to do it.
Connect, work on a local dataset, reconnect to update, disconnect again, and so on.
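Roughly like this (a rough, untested sketch; SQLite3Connection1, SQLTransaction1 and SQLQuery1 are the usual sqldb components, and keeping the local dataset alive across the disconnect is the tricky part, see further down in the thread):
SQLite3Connection1.Connected:=true;   // connect only for the short time needed to update
SQLQuery1.ApplyUpdates;               // push the locally buffered edits to the database
SQLTransaction1.CommitRetaining;      // commit and release the lock
SQLite3Connection1.Connected:=false;  // free the database again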

Everybody is trying to re-use connections as much as possible to avoid the authorisation overhead of connecting. Connection pooling was invented for certain types of applications precisely to solve the re-connection problem. Are you saying everybody is wrong there?
I am checking right now with SQLite3: when one Lazarus app opens the database, another Lazarus app cannot open it, even if the first one is not writing. It is locked merely because the first app has it open.
Again wrong. The database is locked because a transaction has been started and a select has been executed (the one that fills the dbgrid). What is your source of information for all these statements?
As usual, you won't believe me. So here is a simple example of a program with a dbgrid that can be started as many times as you want and that allows updating data simultaneously from the different instances. The usual caveats apply, such as updating a record that was deleted in another session, but as a db expert you should be able to handle that easily. Nothing sqlite-specific about it.
procedure TForm1.FormCreate(Sender: TObject);
begin
  // assumes SQLite3Connection1, SQLTransaction1 and SQLQuery1 are linked up at design time
  SQLQuery1.SQL.Text:='create table if not exists test (id integer primary key, firstname varchar(20), lastname varchar(20))';
  SQLQuery1.ExecSQL;
  SQLQuery1.SQL.Text:='select * from test';
  SQLQuery1.Open;
  // commit right away so the select does not keep the database locked
  SQLTransaction1.CommitRetaining;
end;
procedure TForm1.BtnApplyClick(Sender: TObject);
begin
  SQLQuery1.ApplyUpdates;           // send the buffered edits to the database
  SQLTransaction1.CommitRetaining;  // commit them and release the lock
  SQLQuery1.Refresh;                // reload so edits from other instances become visible
  SQLTransaction1.CommitRetaining;  // and release the lock taken by the refresh
end;
Key to the operation is the CommitRetaining after every Refresh or Open (select) operation. Normally you wouldn't need the SQLQuery1.Refresh, but you don't want sqldb to do a refresh behind your back and lock the database again. I'm not saying you will be able to catch every case where sqldb refreshes behind your back (dblookup controls or master-detail relations, for example), but the demo shows clearly why the database gets locked.
The main reason, it seems to me, is that disconnecting the SQLite3Connection disables the dbGrid.
I was expecting that once one has the local dataset, one could disconnect, still work on the dbGrid, and connect back only for the short time necessary to update the database, but this does not seem possible.
If one uses something other than a DBGrid, perhaps this is achievable.
However, I don't see a reason why disconnecting should prevent working with the local dataset and the DBGrid.
This has nothing to do with "timing" and "synchronizing", as far as I can tell, since it's mostly a go-no-go situation.
Let me correct this, because I wouldn't want other readers to pick up on the confusion that reigns in this post.
- The dbGrid is not disabled by disconnecting the SQLite3Connection. The dataset is closed when the connection goes away. The DBGrid is still enabled, but there is no data to show or work with.
- Working with a disconnected dataset is possible with TSQLQuery. It inherits from TBufDataset, and setting the FileName property of the TSQLQuery before closing the connection allows you to continue working with the dataset. To disconnect your connection and continue working with the dataset, do
SQLQuery1.FileName:='store.dat';      // the buffered data will be saved here on Close
SQLQuery1.Close;                      // writes the current data to store.dat
SQLite3Connection1.Connected:=false;
SQLQuery1.Open;                       // reopens from store.dat, no connection needed
You can now work on the data without being connected. ApplyUpdates will obviously not work, because it is meant to send data to the database back-end. 'store.dat' is not a special name and can be anything. With some precaution you can even stop the program, restart it later, and pick up the local changes you made previously.
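For that restart scenario, something along these lines in FormCreate could work (a hedged, untested sketch that reuses the component names and the 'store.dat' file from above):
procedure TForm1.FormCreate(Sender: TObject);
begin
  SQLQuery1.SQL.Text:='select * from test';
  SQLQuery1.FileName:='store.dat';
  if FileExists('store.dat') then
    // a previous session saved its buffer: reopen it offline
    SQLQuery1.Open
  else
  begin
    // first run: fetch from the database as usual
    SQLite3Connection1.Connected:=true;
    SQLQuery1.Open;
    SQLTransaction1.CommitRetaining;
  end;
end;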
To reconnect, simply do
SQLite3Connection1.Connected:=true;
and you are back on-line. You can call ApplyUpdates and the local changes will be sent to the database.
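Put together, the reconnect-and-apply step could look like this (again a hedged sketch with the same assumed component names; the CommitRetaining and the offline sequence simply repeat the pattern shown above):
SQLite3Connection1.Connected:=true;   // back on-line
SQLQuery1.ApplyUpdates;               // send the locally buffered changes
SQLTransaction1.CommitRetaining;      // commit and release the lock
// and, if you want to go offline again, repeat the earlier sequence:
SQLQuery1.Close;                      // saves the buffer to store.dat
SQLite3Connection1.Connected:=false;
SQLQuery1.Open;                       // continue working offline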
- The DBGrid is not related to this at all. Any data-aware control behaves the same: closed dataset = no data to display.