Hi
I'd write a little utility that takes a '.dbf' file (a table) as an input parameter and then:
1) Connects to a new 'sqlite3' database, e.g. named after the project,
or connects to an existing database.
2) Reads the table layout of the 'dbf', forms an SQL statement that will
create a matching (copy of the) table in the sqlite3 db, and executes it.
3) Runs through the 'dbf' file, reads the data and inserts it into the newly
created sqlite3 table.
4) Closes the database and the 'dbf' table/file ...and we're done.
5) "Rinse and repeat" for every 'dbf' table you have.
This is called a /datapump/ utility.
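The steps above can be sketched in a few lines. The following is a minimal illustration in Python (using the stdlib `sqlite3` module), not the TDbf/SQLQuery route the thread is about: it parses a dBase III header by hand, builds the CREATE TABLE from the field descriptors, and pumps the records across. The tiny hand-built .dbf image at the bottom is an assumption purely to exercise the code; a real utility would open a file and need a fuller type map.

```python
import sqlite3
import struct

def dbf_fields(buf):
    """Parse record count, sizes and field descriptors from a dBase III header."""
    nrec, hdr_size, rec_size = struct.unpack_from("<IHH", buf, 4)
    fields, off = [], 32
    while buf[off] != 0x0D:                      # 0x0D terminates the descriptor array
        name = buf[off:off + 11].split(b"\x00")[0].decode("ascii")
        ftype = chr(buf[off + 11])               # C=char, N=numeric, D=date, L=logical
        flen = buf[off + 16]                     # field length in bytes
        fields.append((name, ftype, flen))
        off += 32
    return nrec, hdr_size, rec_size, fields

def datapump(dbf_bytes, conn, table):
    nrec, hdr_size, rec_size, fields = dbf_fields(dbf_bytes)
    # 2) form a CREATE TABLE statement from the DBF layout and execute it
    typemap = {"C": "TEXT", "N": "NUMERIC", "D": "TEXT", "L": "INTEGER"}
    cols = ", ".join('"%s" %s' % (n, typemap.get(t, "TEXT")) for n, t, _ in fields)
    conn.execute('CREATE TABLE IF NOT EXISTS "%s" (%s)' % (table, cols))
    # 3) run through the fixed-width records and insert the live ones
    rows = []
    for i in range(nrec):
        rec = dbf_bytes[hdr_size + i * rec_size : hdr_size + (i + 1) * rec_size]
        if rec[:1] == b"*":                      # '*' marks a deleted record
            continue
        vals, pos = [], 1                        # byte 0 is the deletion flag
        for _, _, flen in fields:
            vals.append(rec[pos:pos + flen].decode("ascii").strip())
            pos += flen
        rows.append(vals)
    placeholders = ", ".join("?" * len(fields))
    conn.executemany('INSERT INTO "%s" VALUES (%s)' % (table, placeholders), rows)
    conn.commit()

# A tiny hand-built .dbf image (two columns, two rows) just to exercise the pump:
flds = b""
for fname, ftype, flen in ((b"NAME", b"C", 10), (b"QTY", b"N", 5)):
    flds += fname.ljust(11, b"\x00") + ftype + b"\x00" * 4 + bytes([flen, 0]) + b"\x00" * 14
header = struct.pack("<B3BIHH", 3, 24, 1, 1, 2, 32 + len(flds) + 1, 16) + b"\x00" * 20
records = (b" " + b"Widget".ljust(10) + b"    3") + (b" " + b"Gadget".ljust(10) + b"   12")
demo = header + flds + b"\x0D" + records

conn = sqlite3.connect(":memory:")
datapump(demo, conn, "parts")
print(conn.execute("SELECT NAME, QTY FROM parts ORDER BY QTY").fetchall())
```

Step 5, "rinse and repeat", is then just calling `datapump` once per table into the same connection.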
When this is out of the way, you can start on your application, and since both technologies (SqlQuery and Tdbf) descend from 'TDataset', this should be /doable/.
Have fun
Benny
Be careful with this approach.
We don't know anything about the Parent/Child-Tables/Relations (PrimaryKey/ForeignKey) in his design.
That said: Parent-Table before Child-Table.
Meaning: even the order in which you "export/import" your DBF-Tables to SQLite is important.
(Yes, I know there is a way in which you can ignore the order, but ... seriously???)
It might even be worth checking the design/layout of your Database-Structure and reworking it if necessary.
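The ordering point is easy to demonstrate: with foreign-key enforcement switched on, SQLite rejects a child row whose parent has not been imported yet. A small sketch (the `customer`/`orders` tables are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")     # SQLite only enforces FKs when this is on
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY,"
             " cust_id INTEGER REFERENCES customer(id))")

# Child before Parent: the import blows up halfway through
child_first_failed = False
try:
    conn.execute("INSERT INTO orders VALUES (1, 42)")
except sqlite3.IntegrityError:               # "FOREIGN KEY constraint failed"
    child_first_failed = True

# Parent before Child: same rows, right order, no problem
conn.execute("INSERT INTO customer VALUES (42, 'Some Customer')")
conn.execute("INSERT INTO orders VALUES (1, 42)")
print(child_first_failed)
```

The "way to ignore the order" is importing with `PRAGMA foreign_keys` left off (or declaring the constraints `DEFERRABLE INITIALLY DEFERRED`), but that silently lets dangling references into the database; `PRAGMA foreign_key_check` can hunt them down afterwards.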
EDIT: My Approach.
Create your SQLite-Database with an external tool (DBeaver, DB Browser for SQLite et al.).
Create your Tables, Columns, Primary Keys, Foreign Keys, Constraints, Indexes whatever according to the Datatypes used for each Column.
Then use the CSV-Approach mentioned by egsuh to export the Data in the DBF-Tables (IIRC DBeaver should be able to do that, if not even migrate it directly to SQLite),
and use the "Import CSV" feature available in DB Browser for SQLite.
That way you would be able to catch any "malformed" original Data.
Special care has to be taken with the "difficult" Datatypes, specifically DateTime.
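For the scripted variant of that CSV leg, the date handling can look like the following sketch. It assumes the exported CSV carries DBF-style `YYYYMMDD` date strings (the raw storage format of a DBF 'D' field) and normalizes them to ISO-8601, which is what SQLite's built-in date functions expect; the `staff` table and its rows are invented for the example:

```python
import csv
import io
import sqlite3
from datetime import datetime

# Hypothetical CSV export of a DBF table; 'D' (date) fields come out as YYYYMMDD.
exported = io.StringIO("id,name,hired\n1,Alice,20230415\n2,Bob,20240101\n")

def to_iso(dbf_date):
    """Turn a DBF-style YYYYMMDD string into ISO-8601 for SQLite's date functions."""
    return datetime.strptime(dbf_date, "%Y%m%d").date().isoformat()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staff (id INTEGER PRIMARY KEY, name TEXT, hired TEXT)")
conn.executemany(
    "INSERT INTO staff VALUES (?, ?, ?)",
    ((int(row["id"]), row["name"], to_iso(row["hired"]))
     for row in csv.DictReader(exported)))
conn.commit()
print(conn.execute("SELECT name, hired FROM staff ORDER BY id").fetchall())
```

A malformed date then fails loudly in `strptime` during the import, which is exactly the "catch any malformed original Data" point above, rather than surfacing months later in a query.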