
Author Topic: Testing (JSON) Variants  (Read 5280 times)

dbannon

  • Hero Member
  • *****
  • Posts: 2786
    • tomboy-ng, a rewrite of the classic Tomboy
Re: Testing (JSON) Variants
« Reply #15 on: January 21, 2022, 11:51:18 pm »
Frank, while I started with FPjson, I have moved one of my projects to jsontools: http://www.getlazarus.org/json

I found it far easier to work with, especially for handling bad JSON without the user seeing the ugly details!
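For example, the parser can be asked whether the text was valid at all, so you can fall back quietly. A rough sketch going by the TJsonNode interface documented on that page ('title' is an invented key):

Code: Pascal
uses
  JsonTools;

procedure LoadNote(const Raw: string);
var
  N, V: TJsonNode;
begin
  N := TJsonNode.Create;
  try
    // TryParse returns False on malformed input instead of raising,
    // so the user never sees a parser exception.
    if N.TryParse(Raw) then
    begin
      V := N.Find('title');          // 'title' is an invented key
      if V <> nil then
        WriteLn(V.AsString);
    end
    else
      WriteLn('Not valid JSON, falling back to defaults');
  finally
    N.Free;
  end;
end;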

Davo
Lazarus 3, Linux (and reluctantly Win10/11, OSX Monterey)
My Project - https://github.com/tomboy-notes/tomboy-ng and my github - https://github.com/davidbannon

SymbolicFrank

  • Hero Member
  • *****
  • Posts: 1313
Re: Testing (JSON) Variants
« Reply #16 on: January 24, 2022, 05:38:30 pm »
Well, the main problem with imports is that there will be bad and incorrect data in them: that key that is an integer 99.9% of the time... I have managed to work around most 'illegal conversion errors' by now. Getting the data into a Variant isn't the problem; the problem is assigning the value inside it to something typed.
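What I ended up with is a guard per target type, roughly like this (a sketch, not my actual code):

Code: Pascal
uses
  SysUtils, Variants;

function VarToIntDef(const V: Variant; Default: Integer): Integer;
begin
  // Reject Null/Empty first, then convert only what is safe; anything
  // that still fails (e.g. the string '99.9' in an "integer" column)
  // falls back to the default instead of raising.
  Result := Default;
  if VarIsNull(V) or VarIsEmpty(V) then
    Exit;
  if VarIsOrdinal(V) then
    Exit(V);
  if VarIsStr(V) then
    Exit(StrToIntDef(Trim(VarToStr(V)), Default));
  try
    Result := V;  // last resort: let the Variants unit attempt the cast
  except
    on EVariantError do
      Result := Default;
  end;
end;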

Also, my first intent was to use "upserts" to simply insert the data in large transactions, but that really only works if there are no other queries you want to run in the meantime: looking up key values, moving things from one table to another, and so on. Most of the time, after an insert, you need to know the value of the autoincrement key, which requires a commit anyway.
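Depending on the backend, though, INSERT ... RETURNING can hand that key back without a commit; PostgreSQL, Firebird and recent SQLite support it. An untested SQLdb sketch, table and column names invented:

Code: Pascal
uses
  SQLDB;

function InsertPerson(Q: TSQLQuery; const AName: string): Integer;
begin
  Q.SQL.Text := 'INSERT INTO person(name) VALUES (:name) RETURNING id';
  Q.Params.ParamByName('name').AsString := AName;
  Q.Open;                       // Open, not ExecSQL: a row comes back
  Result := Q.Fields[0].AsInteger;
  Q.Close;
end;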

Again, the problem isn't the JSON; that was just an example. There are very many different formats, and JSON is a recent one, a small minority overall. CSV variants are probably the most common format. Plenty of applications made up their own format, and some are just a bunch of obscure database files without a known format, which have to be deciphered.
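For the CSV variants, about the only generic trick I know is sniffing the delimiter from the first line before parsing. A crude heuristic sketch:

Code: Pascal
function GuessDelimiter(const HeaderLine: string): Char;
const
  Candidates: array[0..3] of Char = (',', ';', #9, '|');
var
  C: Char;
  I, Count, Best: Integer;
begin
  // Pick whichever candidate character occurs most often in the
  // header line; default to a comma when nothing matches.
  Result := ',';
  Best := 0;
  for C in Candidates do
  begin
    Count := 0;
    for I := 1 to Length(HeaderLine) do
      if HeaderLine[I] = C then
        Inc(Count);
    if Count > Best then
    begin
      Best := Count;
      Result := C;
    end;
  end;
end;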

But thanks for all the advice!

SymbolicFrank

  • Hero Member
  • *****
  • Posts: 1313
Re: Testing (JSON) Variants
« Reply #17 on: January 25, 2022, 05:01:26 pm »
In retrospect, I think the best way to import data is to shovel it all into the SQL database as-is and use SQL to insert it into the target tables, although that would require a custom set of staging tables for each data source (see the sketch below).

I had added an extra field to each table to store the key from the source data (or a counter value if needed), but I should have added an extra field for each foreign key as well. Then I could have filled in the right keys through SQL instead of through code.
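Something like this, with invented table names. The staging table keeps everything as text plus the original keys, so loading it can never fail, and one set-based statement fills the typed target table and resolves the foreign key at the same time:

Code: Pascal
uses
  SQLDB;

procedure ImportOrderLines(Q: TSQLQuery);
begin
  // Staging table mirrors the source file as-is.
  Q.SQL.Text :=
    'CREATE TABLE stg_line(src_key TEXT, src_order_key TEXT, ' +
    'product TEXT, qty TEXT)';
  Q.ExecSQL;

  // ... bulk-load stg_line from the source file here ...

  // One INSERT..SELECT resolves the foreign key by joining on the
  // stored source keys, instead of looking up ids row by row in code.
  Q.SQL.Text :=
    'INSERT INTO order_line(order_id, product, qty, src_key) ' +
    'SELECT o.id, s.product, CAST(s.qty AS INTEGER), s.src_key ' +
    'FROM stg_line s JOIN orders o ON o.src_key = s.src_order_key';
  Q.ExecSQL;
end;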

All in all: use Variants as little as possible, especially the ones that might contain a Date/Time/Timestamp value :)
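When I can't avoid them, a guard like this keeps the date ones from blowing up (sketch):

Code: Pascal
uses
  SysUtils, Variants;

function VarToDateTimeDef(const V: Variant; Default: TDateTime): TDateTime;
var
  D: TDateTime;
begin
  // Only varDate converts safely; for strings, TryStrToDateTime avoids
  // the exception a blind assignment from the Variant would raise.
  Result := Default;
  if VarIsType(V, varDate) then
    Result := VarToDateTime(V)
  else if VarIsStr(V) and TryStrToDateTime(Trim(VarToStr(V)), D) then
    Result := D;
end;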
« Last Edit: January 25, 2022, 07:14:10 pm by SymbolicFrank »

 
