
### Author Topic: Import csv. to TChart  (Read 4063 times)

#### flori

• Full Member
• Posts: 196
##### Import csv. to TChart
« on: January 05, 2022, 05:45:40 pm »
Hi everyone!

Does anyone know how to import my data.csv into TChart?

Thank you

#### wp

• Hero Member
• Posts: 9760
##### Re: Import csv. to TChart
« Reply #1 on: January 05, 2022, 06:44:41 pm »
What do you want to achieve?

Your data file contains columns x, y, z, and it seems to me that you want to create a 3D plot of points at height z above the xy plane. This is not supported. There is some experimental work on contour and color map plots, though; search the forum.

Or do you want to plot y vs x and z vs x? See my attached demo for this.

#### flori

• Full Member
• Posts: 196
##### Re: Import csv. to TChart
« Reply #2 on: January 05, 2022, 07:00:37 pm »
I created a small demo with Python using x, y, z data:
x = latitude
y = longitude
z = data

Code: [Select]
`x = data[:, 0]
y = data[:, 1]
z = data[:, 2]
xll = x.min(); xul = x.max()
yll = y.min(); yul = y.max()`
... and then I interpolate with the griddata method (cubic). I tried kriging too.
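For reference, flori's Python approach could be sketched as below with `scipy.interpolate.griddata`. The sample rows are made up stand-ins for data.csv, and the grid resolution is an arbitrary choice, not something stated in the thread:

```python
import numpy as np
from scipy.interpolate import griddata

# Made-up sample rows standing in for data.csv (x = latitude,
# y = longitude, z = data value, as described in the post).
data = np.array([
    [17.45, 83.80, 4.71],
    [17.50, 83.85, 4.90],
    [17.55, 83.90, 5.10],
    [17.60, 83.82, 4.60],
    [17.52, 83.95, 5.30],
])
x, y, z = data[:, 0], data[:, 1], data[:, 2]

# Build a regular grid spanning the bounding box of the scattered points.
xi = np.linspace(x.min(), x.max(), 50)
yi = np.linspace(y.min(), y.max(), 50)
XI, YI = np.meshgrid(xi, yi)

# Cubic interpolation of the scattered z values onto the regular grid;
# points outside the convex hull of the data come back as NaN.
ZI = griddata((x, y), z, (XI, YI), method='cubic')
print(ZI.shape)
```

The resulting `ZI` array is what a contour or color-map plot would then be drawn from.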

#### wp

• Hero Member
• Posts: 9760
##### Re: Import csv. to TChart
« Reply #3 on: January 05, 2022, 07:20:17 pm »
As I said, this kind of mapping is not yet supported. Use the mapinterpolation link in the other post (https://forum.lazarus.freepascal.org/index.php/topic,57507.msg427700.html#msg427700).

#### flori

• Full Member
• Posts: 196
##### Re: Import csv. to TChart
« Reply #4 on: January 05, 2022, 07:30:44 pm »
Yes, I tried your demo, but it doesn't work with my data.
Then I'll stay with Python.
Thank you.

#### wp

• Hero Member
• Posts: 9760
##### Re: Import csv. to TChart
« Reply #5 on: January 05, 2022, 10:12:33 pm »
I see. That demo was written for data points arranged in a rectangular grid, whereas your x/y positions are completely irregular. That requires some rework of the maths...

#### speter

• Sr. Member
• Posts: 277
##### Re: Import csv. to TChart
« Reply #6 on: January 07, 2022, 12:20:58 am »
Hey Flori, are you trying to visualise the data _or_ are you specifically trying to use TChart?
If you are only interested in TChart, please ignore this post.

I am asking because a simpler approach to visualising the data might be to "reformat" the .csv file and load it into a modelling and/or visualisation program.

At present each line (except the first) has the form:
Code: [Select]
`<x>,<y>,<z>`
for example:
Code: [Select]
`17.45765,83.80488,4.711421`
It would be quite simple (for example) to write a program to output a file for OpenSCAD (which is a script-based modeller).
Each line could take the form:
Code: [Select]
`'translate(['+s+']) cube([0.025, 0.025, 0.025], true);'`
where "s" is the original x,y,z, giving:
Code: [Select]
`translate([17.45765,83.80488,4.711421]) cube([0.025, 0.025, 0.025], true);`
(See the attachments.)

You could alternatively use a more general 3d format (for example DXF)...
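speter's reformatting idea can be sketched in a few lines; here in Python for brevity (the thread's converter would presumably be a Pascal program, and the cube size is just the illustrative value from the post):

```python
# Sketch of the CSV-to-OpenSCAD reformatting described above: turn each
# '<x>,<y>,<z>' line into an OpenSCAD statement placing a small cube there.
def csv_to_openscad(csv_lines, cube_size=0.025):
    """Turn 'x,y,z' CSV lines into OpenSCAD translate/cube statements."""
    out = []
    for line in csv_lines:
        s = line.strip()
        if not s:
            continue  # skip blank lines
        out.append(
            'translate([%s]) cube([%g, %g, %g], true);'
            % (s, cube_size, cube_size, cube_size)
        )
    return out

print(csv_to_openscad(['17.45765,83.80488,4.711421'])[0])
# translate([17.45765,83.80488,4.711421]) cube([0.025, 0.025, 0.025], true);
```

Writing the returned lines to a .scad file and opening it in OpenSCAD gives the point-cloud-of-cubes view shown in the attachments.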
I climbed mighty mountains, and saw that they were actually tiny foothills.

Laz 2.2.0 / FPC 3.2.2 / Windows 11 (64bit)

#### wp

• Hero Member
• Posts: 9760
##### Re: Import csv. to TChart
« Reply #7 on: January 07, 2022, 12:25:27 pm »
Here is a similar non-TAChart solution. It uses the excellent gnuplot application. The demo creates a gnuplot script, executes it, captures the gnuplot output, and displays it in a TImage of your Lazarus program.

The output has minor differences compared with your screenshot above. This is due to a different interpolation (the method can be selected, though) and to not exactly matching color scales.
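wp's attached demo itself isn't shown in the thread, but the script-generation step it describes might look roughly like this (sketched in Python; the file names, terminal settings, and plot style here are assumptions, not taken from the demo):

```python
# Build a gnuplot script that grids the scattered CSV data with dgrid3d
# and renders it to a PNG, which a host program can then display.
def make_gnuplot_script(csv_file, png_file, rows=200, cols=200):
    return '\n'.join([
        'set terminal pngcairo size 800,600',
        'set output "%s"' % png_file,
        'set datafile separator ","',
        # map the scattered points onto a regular rows x cols grid
        'set dgrid3d %d,%d splines' % (rows, cols),
        'set pm3d map',  # top-down color-map view
        'splot "%s" using 1:2:3 with pm3d notitle' % csv_file,
    ])

script = make_gnuplot_script('data.csv', 'out.png')
print(script)
```

A Lazarus program would write this script to a temporary file, run `gnuplot <scriptfile>` via TProcess, and load the resulting PNG into a TImage, which matches the workflow wp describes.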

#### flori

• Full Member
• Posts: 196
##### Re: Import csv. to TChart
« Reply #8 on: January 10, 2022, 10:24:30 pm »
Thank you, everyone.
Is `set dgrid3d 200, 200, splines` where I change which interpolation to use, e.g. inverse distance weighting, etc.?
The demo is wow!

#### wp

• Hero Member
• Posts: 9760
##### Re: Import csv. to TChart
« Reply #9 on: January 10, 2022, 10:51:23 pm »
Have a look at the extensive gnuplot documentation which you can download as a pdf (http://www.gnuplot.info/documentation.html).

Here I paste the docs for "set dgrid3d":

Quote
Dgrid3d
The set dgrid3d command enables, and can set parameters for, non-grid to grid data mapping. See splot grid data (p. 232) for more details about the grid data structure.
Syntax:
set dgrid3d {<rows>} {,{<cols>}}
{ splines |
qnorm {<norm>} |
(gauss | cauchy | exp | box | hann)
{kdensity} {<dx>} {,<dy>} }
unset dgrid3d
show dgrid3d
By default dgrid3d is disabled. When enabled, 3D data read from a file are always treated as a scattered data set. A grid with dimensions derived from a bounding box of the scattered data and size as specified by the row/col size parameters is created for plotting and contouring. The grid is equally spaced in x (rows) and in y (columns); the z values are computed as weighted averages or spline interpolations of the scattered points' z values. In other words, a regularly spaced grid is created and then a smooth approximation to the raw data is evaluated for each grid point. This approximation is plotted in place of the raw data.
The number of columns defaults to the number of rows, which defaults to 10.
Several algorithms are available to calculate the approximation from the raw data. Some of these algorithms can take additional parameters. These interpolations are such that the closer the data point is to a grid point, the more effect it has on that grid point.
The splines algorithm calculates an interpolation based on thin plate splines. It does not take additional parameters.
The qnorm algorithm calculates a weighted average of the input data at each grid point. Each data point is weighted by the inverse of its distance from the grid point raised to some power. The power is specified as an optional integer parameter that defaults to 1. This algorithm is the default.
Finally, several smoothing kernels are available to calculate weighted averages: z = Sum_i w(d_i) * z_i / Sum_i w(d_i), where z_i is the value of the i-th data point and d_i is the distance between the current grid point and the location of the i-th data point. All kernels assign higher weights to data points that are close to the current grid point and lower weights to data points further away.
The following kernels are available:
gauss  : w(d) = exp(-d*d)
cauchy : w(d) = 1/(1 + d*d)
exp    : w(d) = exp(-d)
box    : w(d) = 1 if d<1
         = 0 otherwise
hann   : w(d) = 0.5*(1+cos(pi*d)) if d<1
         = 0 otherwise
When using one of these five smoothing kernels, up to two additional numerical parameters can be specified: dx and dy. These are used to rescale the coordinate differences when calculating the distance: d_i = sqrt(((x-x_i)/dx)**2 + ((y-y_i)/dy)**2), where x,y are the coordinates of the current grid point and x_i,y_i are the coordinates of the i-th data point. The value of dy defaults to the value of dx, which defaults to 1. The parameters dx and dy make it possible to control the radius over which data points contribute to a grid point IN THE UNITS OF THE DATA ITSELF.
The optional keyword kdensity, which must come after the name of the kernel, but before the (optional) scale parameters, modifies the algorithm so that the values calculated for the grid points are not divided by the sum of the weights (z = Sum_i w(d_i) * z_i). If all z_i are constant, this effectively plots a bivariate kernel density estimate: a kernel function (one of the five defined above) is placed at each data point, the sum of these kernels is evaluated at every grid point, and this smooth surface is plotted instead of the original data. This is similar in principle to what the smooth kdensity option does to 1D datasets. (See kdensity2d.dem for a usage demo.)
DEPRECATED: A slightly different syntax is also supported for backwards compatibility. If no interpolation algorithm has been explicitly selected, the qnorm algorithm is assumed. Up to three comma-separated, optional parameters can be specified, which are interpreted as the number of rows, the number of columns, and the norm value, respectively.
The dgrid3d option is a simple scheme which replaces scattered data with weighted averages on a regular grid. More sophisticated approaches to this problem exist and should be used to preprocess the data outside gnuplot if this simple solution is found inadequate.
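The kernel-weighted average quoted above, z = Sum_i w(d_i) * z_i / Sum_i w(d_i), can be illustrated numerically in a few lines. This is a toy sketch with made-up data points, not gnuplot's actual implementation:

```python
import numpy as np

# Scattered sample points (made up for the illustration).
px = np.array([0.0, 1.0, 2.0])   # x coordinates of data points
py = np.array([0.0, 1.0, 0.0])   # y coordinates of data points
pz = np.array([1.0, 3.0, 2.0])   # z values of data points

def dgrid3d_gauss(gx, gy, dx=1.0, dy=1.0):
    """Value at one grid point (gx, gy) using the gauss kernel w(d) = exp(-d*d)."""
    # rescaled distances d_i = sqrt(((x-x_i)/dx)**2 + ((y-y_i)/dy)**2)
    d = np.sqrt(((gx - px) / dx) ** 2 + ((gy - py) / dy) ** 2)
    w = np.exp(-d * d)
    # z = Sum_i w(d_i) * z_i / Sum_i w(d_i)
    return (w * pz).sum() / w.sum()

# Evaluated at the first data point, the nearest sample dominates,
# so the result is pulled toward its z value of 1.0.
print(dgrid3d_gauss(0.0, 0.0))
```

Evaluating this function over every node of a regular grid is exactly the "smooth approximation ... evaluated for each grid point" that the documentation describes.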