
Author Topic: In Memory Database

Weitentaaal

  • Hero Member
  • *****
  • Posts: 524
  • Weitental is a very beautiful garbage depot.
In Memory Database
« on: May 15, 2024, 10:02:47 am »
Hello,

I have a large database (SQLite) and my application is being slowed down by it. I have indexed the tables and selected a primary key, but it's still kind of slow. I was thinking about copying the database into RAM and, after finishing all tasks, copying it back. How would I achieve this?

rvk

  • Hero Member
  • *****
  • Posts: 6264
Re: In Memory Database
« Reply #1 on: May 15, 2024, 10:50:11 am »
I have indexed the tables and selected a primary key, but it's still kind of slow.
Are you sure the indexes are really used by the SQL statements you run?

You can check this by examining the query PLAN (in combination with the actual SQL).
(It should be visible in a normal DB manager when executing the SQL.)

https://www.sqlite.org/eqp.html
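For example, you can prefix the statement with EXPLAIN QUERY PLAN and read the result directly from a TSQLQuery. This is only a sketch; `Query` and the `Equip`/`TeilIndex` names are placeholders taken from this thread, not a tested setup:

```pascal
// Sketch only: the last column of each EXPLAIN QUERY PLAN row is the
// human-readable "detail" text, e.g. "SEARCH Equip USING INDEX ...".
Query.SQL.Text := 'EXPLAIN QUERY PLAN SELECT * FROM Equip WHERE TeilIndex = 42';
Query.Open;
while not Query.EOF do
begin
  WriteLn(Query.Fields[Query.FieldCount - 1].AsString);
  Query.Next;
end;
Query.Close;
```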


Weitentaaal

Re: In Memory Database
« Reply #2 on: May 15, 2024, 12:09:56 pm »
I got the following as result: "SCAN TABLE Equip USING INDEX sqlite_autoindex_Equip_1"

The query I used was: "EXPLAIN QUERY PLAN SELECT * FROM Equip ORDER BY TeilIndex"

TeilIndex is my primary key.

What does that mean? I then continue using TeilIndex in the data table, which contains all items with their values and descriptions. TeilIndex is the "link" to the data table.

rvk

Re: In Memory Database
« Reply #3 on: May 15, 2024, 12:17:34 pm »
It means that for the SQL "SELECT * FROM Equip ORDER BY TeilIndex" the index sqlite_autoindex_Equip_1 is used.
Which is good. That should be fast.

If you say it is slow... then something else is slow.
For example... if you use a different SQL statement (or add a WHERE clause to it).
Or... if you do a Query.Last (which results in fetching ALL records at once).

So it also depends on what you do with that dataset.
If you retrieve it and NEED to traverse ALL the records... that's always going to be slow.

If you do something that results in retrieving too much data (which isn't used directly), then you waste a lot of bandwidth and CPU that isn't needed.

And what do you call "slow"?
Which components are slow at getting the data?
(Are you using TSQLQuery or another component?)

Weitentaaal

Re: In Memory Database
« Reply #4 on: May 15, 2024, 12:47:40 pm »
I do use TSQLQuery. I just measured, and it took 8 seconds to fully load all my data into my application. I use only SELECT statements like: "SELECT THISANDTHAT FROM TABLE WHERE INDEX = XY"

After that I assign my values like this: ValueXY := SQLQuery.FieldByName('XY').AsWhatever;
I have a lot of objects that get their data like this.

I create the TSQLQuery object like this:

Code: Pascal
initialization
   QUERY := TSQLQuery.Create(nil);
finalization
   QUERY.Free;

I know it's not the safest way of doing it.

It's just that I have some linked lists which contain objects, and for each object in each linked list I load the data as I showed previously.
I will try to optimize the queries even further. Is there anything else I could do? Maybe joining the data would be faster?

rvk

Re: In Memory Database
« Reply #5 on: May 15, 2024, 12:58:52 pm »
I do use TSQLQuery. I just measured, and it took 8 seconds to fully load all my data into my application. I use only SELECT statements like: "SELECT THISANDTHAT FROM TABLE WHERE INDEX = XY"
Yikes. 8 seconds...

After that I assign my values like this: ValueXY := SQLQuery.FieldByName('XY').AsWhatever;
I have a lot of objects that get their data like this.
Yes. But that could be the problem.
For how many objects do you need to fetch that data?

And wouldn't it be easier to do a JOIN instead of executing "SELECT THISANDTHAT FROM TABLE WHERE INDEX = XY" multiple times?

Every execute is very time-consuming.

It's just that I have some linked lists which contain objects, and for each object in each linked list I load the data as I showed previously.
I will try to optimize the queries even further. Is there anything else I could do? Maybe joining the data would be faster?
Ha... yes... (I didn't read ahead :) ). A JOIN could be much more efficient.
It would depend on exactly what you are doing (hard to say without the actual code and DDL of the database).
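Such a JOIN could be sketched roughly like this. The table and column names (Equip, Data, TeilIndex, Value, Description) are assumptions pieced together from this thread, not the poster's actual schema:

```pascal
// Sketch only: one JOIN and one round trip instead of one SELECT per object.
Query.SQL.Text :=
  'SELECT e.TeilIndex, d.Value, d.Description ' +
  'FROM Equip e ' +
  'JOIN Data d ON d.TeilIndex = e.TeilIndex';
Query.Open;
while not Query.EOF do
begin
  // hand the joined row to the matching object here
  Query.Next;
end;
Query.Close;
```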

Khrys

  • New Member
  • *
  • Posts: 20
Re: In Memory Database
« Reply #6 on: May 15, 2024, 01:11:41 pm »
Quote
I was thinking about copying the database into RAM and, after finishing all tasks, copying it back. How would I achieve this?

With SQLite3 you can use sqlite3_serialize and sqlite3_deserialize: https://www.sqlite.org/c3ref/serialize.html

I'm using the following in my own code (TSQLite3ConnectionHelper is a type helper for TSQLite3Connection):

Code: Pascal
function TSQLite3ConnectionHelper.Serialize(const TargetDB: String = 'main'): TBytes;
var
  Buffer: PByte;
  BufferSize: Int64;
begin
  Result := Default(TBytes);
  Buffer := sqlite3_serialize(Handle, PChar(TargetDB), @BufferSize, 0);
  if Buffer = Nil then Exit;  // sqlite3_serialize can return Nil (e.g. on OOM)
  SetLength(Result, BufferSize);
  Move(Buffer[0], Result[0], BufferSize);
  sqlite3_free(Buffer);
end;

Code: Pascal
function TSQLite3ConnectionHelper.Deserialize(Buffer: TBytes; const TargetDB: String = 'main'): Boolean;
var
  InternalBuffer: PByte;
  Size: Int64;
begin
  Size := Length(Buffer);
  InternalBuffer := sqlite3_malloc64(Size);
  if InternalBuffer = Nil then Exit(False);
  Move(Buffer[0], InternalBuffer[0], Size);
  Result := sqlite3_deserialize(Handle, PChar(TargetDB), InternalBuffer, Size, Size,
                                SQLITE_DESERIALIZE_FREEONCLOSE or SQLITE_DESERIALIZE_RESIZEABLE) = SQLITE_OK;
  if not Result then sqlite3_free(InternalBuffer);
end;

You'd then want to read the database file's contents into a buffer yourself, deserialize it into an SQLite3 connection that's connected to :memory:, and, after you're done, serialize it again and write the new buffer back to the database file.
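That workflow could be sketched like this, building on the Serialize/Deserialize helpers above. Everything here is illustrative: it assumes a TSQLite3Connection whose DatabaseName is ':memory:', and 'data.db' is a placeholder file name:

```pascal
// Sketch only: load an on-disk SQLite file into RAM, work on it there,
// then write the (possibly modified) database back to disk.
procedure RunInMemory(Conn: TSQLite3Connection);
var
  Buf: TBytes;
  FS: TFileStream;
begin
  FS := TFileStream.Create('data.db', fmOpenRead or fmShareDenyWrite);
  try
    SetLength(Buf, FS.Size);
    if Length(Buf) > 0 then
      FS.ReadBuffer(Buf[0], Length(Buf));
  finally
    FS.Free;
  end;

  Conn.Open;               // connects to :memory:
  Conn.Deserialize(Buf);   // all queries now run against RAM

  // ... run the heavy workload here ...

  Buf := Conn.Serialize;   // snapshot the in-memory database
  FS := TFileStream.Create('data.db', fmCreate);
  try
    if Length(Buf) > 0 then
      FS.WriteBuffer(Buf[0], Length(Buf));
  finally
    FS.Free;
  end;
end;
```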

Weitentaaal

Re: In Memory Database
« Reply #7 on: May 15, 2024, 01:22:44 pm »
I will try it with a JOIN and keep the SQL executes as low as possible. I already shaved off some time simply by using a JOIN for 2 SELECTs.

Is it recommended to search for a record with a certain ID?
For example, if I join data from 3 tables into one and then have those 3 records stored in the TSQLQuery:

Code: Pascal
ID 1; COLOR Red; NAME Tom
ID 2; COLOR Blue; NAME Steff
ID 3; COLOR Red; NAME Andy

Now I need the record with ID 2 for one object, the next object needs the record with ID 3, and so on. Do I need to loop over the records until I find the right one, or do I have to rethink my code?
Or do I use a SELECT for each in this case?
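Since TSQLQuery descends from TBufDataset, one option is to look up a buffered record by key with Locate instead of looping manually. A sketch using the example columns above (`SomeObject` and its fields are hypothetical):

```pascal
// Sketch only: Locate returns True and positions the cursor
// on the buffered record whose ID field equals 2.
if Query.Locate('ID', 2, []) then
begin
  SomeObject.Color := Query.FieldByName('COLOR').AsString;
  SomeObject.Name  := Query.FieldByName('NAME').AsString;
end;
```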

Weitentaaal

Re: In Memory Database
« Reply #8 on: May 15, 2024, 01:25:17 pm »
Thank you @Khrys for the example; I will try it in a sample program, just out of curiosity. I think I can solve my current problem with JOINs and a bit of code optimization.

loaded

  • Hero Member
  • *****
  • Posts: 844
Re: In Memory Database
« Reply #9 on: May 15, 2024, 01:25:38 pm »
I have a PostgreSQL database that is approximately 12 GB in size and contains 2.5 million records. I am using a B-tree index.
It works at amazing speed.
Before getting into the RAM issue, I recommend you review your indexes.

Weitentaaal

Re: In Memory Database
« Reply #10 on: May 15, 2024, 01:31:45 pm »
I have a PostgreSQL database that is approximately 12 GB in size and contains 2.5 million records. I am using a B-tree index.
It works at amazing speed.
Before getting into the RAM issue, I recommend you review your indexes.

It's not the indexing that is the problem. It's more likely the code, which can be optimized to get the full advantage out of it.

440bx

  • Hero Member
  • *****
  • Posts: 4216
Re: In Memory Database
« Reply #11 on: May 15, 2024, 02:23:49 pm »
I have a large database (SQLite) and my application is being slowed down by it.
What do you call "large"?... If it's really large, it could easily consume enough memory that the overall system's performance would be negatively affected.

Is your database on an SSD?... I ask because if it is and queries are taking a long time, it's likely the main problem isn't access to the media on which the database resides.

Have you tested how long it takes to copy the largest database file to the bit bucket?... That should give you an idea of the maximum time a query on the table residing in that file should take. If your queries take longer than that, the problem is DBMS-structure related, which means a faster medium is a band-aid on the symptom and not a solution.

 

Thaddy

  • Hero Member
  • *****
  • Posts: 14787
  • Censorship about opinions does not belong here.
Re: In Memory Database
« Reply #12 on: May 15, 2024, 08:19:40 pm »
A) Slow: you write bad code.
B) Slow: you have too much data in memory.

Both are programmer errors.

CharlyTango

  • Jr. Member
  • **
  • Posts: 57
Re: In Memory Database
« Reply #13 on: May 17, 2024, 11:46:03 am »
First of all: what do you call a big database?
https://www.sqlite.org/limits.html

In my experience, poor database performance has 3 main causes:

- bad database design

- bad index structure (if an index is used whose data is not organised in small distinct chunks but rather in large blocks, i.e. the same value occurs often in the indexed field, then the database optimiser may decide to perform a full table scan instead of using the index)

- bad SELECT statements

A query in an SQL database goes through several steps. Roughly, and not completely:
  • Receive the statement
  • Parse the statement
  • Decide on an access path
  • Search for the data
  • Provide the data
  • Transfer the result data

One of the longest steps in terms of time is often transferring the data. This is why the style of programming is crucial for performance.
"SELECT *" is therefore an absolute no-go; retrieve only the absolute minimum of data that is actually needed, in terms of rows AND columns.
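For instance, a parameterised query that names only the needed columns keeps the transferred data small. The identifiers below are invented for illustration:

```pascal
// Sketch only: fetch two columns of one row instead of
// every column of every row.
Query.SQL.Text := 'SELECT Name, Color FROM Equip WHERE TeilIndex = :id';
Query.ParamByName('id').AsInteger := 42;
Query.Open;
```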

Another way to speed up queries is to use views, as these do not have to be parsed and analysed every time, which saves time.


MarkMLl

  • Hero Member
  • *****
  • Posts: 6850
Re: In Memory Database
« Reply #14 on: May 17, 2024, 11:02:01 pm »
I do use TSQLQuery. I just measured, and it took 8 seconds to fully load all my data into my application.

PMJI as a non-SQLite user.

What do you mean by "load data into the application"? If you're doing something like extending a dynamic array one record at a time, you can bet your ar^H^Hlife it's going to be slow: that's not how you're supposed to use SQL.

Even with something like a DB-aware grid: once you start dealing with a table of a billion records or so, you very rapidly learn not to blithely attempt to load the whole thing into a graphic control.

MarkMLl

 
