Hi everyone!
I have an application that uses a few columns backed by the
PostgreSQL JSON data type. I have found a number of JSON-related database posts, but most of them were concerned with serialising existing data into JSON. In contrast, I am storing JSON values entered manually by the user in a column that is automatically validated by the DB server.
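For reference, the column is declared roughly like this (table and column names below are made up, not my real schema):

```sql
-- Hypothetical schema sketch; "documents"/"payload" stand in for my real names.
-- The json type makes the server reject anything that is not valid JSON.
CREATE TABLE documents (
    id      serial PRIMARY KEY,
    payload json NOT NULL
);

-- This succeeds:
INSERT INTO documents (payload) VALUES ('{"a": 1}');
-- This fails with "invalid input syntax for type json":
-- INSERT INTO documents (payload) VALUES ('not json');
```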
The problem I have is this: by default, the Fields editor for TZTable does not recognise the column's data type and adds those fields as plain TField instances. Binding these directly to a TDBMemo makes it impossible to save data - an exception is raised before Post: EDatabaseError "Invalid type conversion to String in field [field_name]". Another oddity is that the default value shown in the memo is a pair of brackets.
Adding those fields explicitly as TStringField allows text input, and no exception is raised during Post, but clicking the refresh button shows that nothing was saved. It seems that, for whatever reason, valid JSON entered in the memo is rejected by PostgreSQL - perhaps some quote escaping is taking place along the way.
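To be concrete, the workaround I tried looks roughly like this (component and column names are placeholders for my actual ones):

```pascal
// Hedged sketch of the workaround: defining the JSON column explicitly
// as a TStringField before opening the table. Names are placeholders.
uses
  DB, ZDataset;

procedure TForm1.SetUpJsonField;
var
  F: TStringField;
begin
  F := TStringField.Create(ZTable1);
  F.FieldName := 'payload';    // the json column
  F.Size := 8192;              // large enough for the JSON text
  F.DataSet := ZTable1;        // registers it as a persistent field
  ZTable1.Open;
  // A TDBMemo bound to this field now accepts text, and Post raises
  // no exception, but the value never seems to reach the server.
end;
```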
Tracing my way through the Lazarus and Zeos code with the debugger, I was sadly unable to find the problem. Is there a simple way to see the exact value that is sent to the backend, and why it is rejected? And if escaping is distorting the data, is there a way to override that behaviour?
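For what it's worth, I know the server itself can be told to log every statement it receives, which might be a fallback for seeing the exact value (it needs superuser access, and I'd prefer something client-side):

```sql
-- Server-side fallback: log every statement PostgreSQL receives,
-- then inspect the server log for the exact value Zeos sends.
ALTER SYSTEM SET log_statement = 'all';
SELECT pg_reload_conf();
-- Remember to set it back afterwards:
-- ALTER SYSTEM RESET log_statement;
```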
Any help on this topic would be most appreciated.
Best regards,
jhvhs.