
Author Topic: Losing postgres (TPQConnection) due to local dynamic IP address change  (Read 8126 times)

tatamata

  • Hero Member
  • *****
  • Posts: 804
    • ZMSQL - SQL enhanced in-memory database
I have a desktop application connecting to a remote PostgreSQL server using TPQConnection.
During very long-running queries the connection is sometimes lost because the dynamic IP address on my router changes.
Is there anything that can be done in the app to prevent this?

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
Do you really mean "long-running queries" (or for that matter long-running transactions) or visual controls which assume that they can hold a connection open indefinitely?

I'm not sure you really can protect against the former other than finding some way to run detached queries on the server. The latter is, I feel, a problem with any development environment which provides visual controls, and unless something like pgPool would help I'm not sure there's an easy way round it other than avoiding visual controls.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

colo

  • New Member
  • *
  • Posts: 48
If something changes the peer address (from the server's perspective) right under your established TCP connection, there is nothing you can do on the NATted client to prevent this session from breaking.

tatamata

  • Hero Member
  • *****
  • Posts: 804
    • ZMSQL - SQL enhanced in-memory database
Do you really mean "long-running queries" (or for that matter long-running transactions) or visual controls which assume that they can hold a connection open indefinitely?

I'm not sure you really can protect against the former other than finding some way to run detached queries on the server. The latter is, I feel, a problem with any development environment which provides visual controls, and unless something like pgPool would help I'm not sure there's an easy way round it other than avoiding visual controls.

MarkMLl
Actually all of that: there are visual controls (for example DBGrids) bound to Postgres tables, and there are also long-running fuzzy data-matching queries that can run for hours or even days.

tatamata

  • Hero Member
  • *****
  • Posts: 804
    • ZMSQL - SQL enhanced in-memory database
Let me share some replies from chatGPT.
"
Your assumption that the dynamic IP address change on your router might be causing the loss of connection to the remote PostgreSQL server is justified. Let me explain why.

When your computer connects to the internet via a home router, it typically receives an IP address from your internet service provider (ISP) dynamically. This means that the IP address can change over time, especially if your internet connection drops or you restart your router.

In the context of connecting to a remote PostgreSQL server from your desktop application, the server's address normally stays the same, but an established TCP connection is identified by the pair of client and server addresses. If your router's public IP address changes while your application is running, packets from the new address no longer match the existing session, so the connection between your application and the PostgreSQL server is broken.

In this scenario, your long-running query may still be executing on the PostgreSQL server, but your application's connection to the server is lost due to the IP address change. As a result, your application won't be able to receive the query results or maintain any ongoing communication with the server.

To address this issue, you could consider a few potential solutions:

1. Static IP address: Contact your ISP to inquire about obtaining a static IP address for your internet connection. A static IP address doesn't change, allowing your application to consistently connect to the PostgreSQL server.

2. Dynamic DNS: If it is the server's address that changes, configure a dynamic DNS (Domain Name System) service that associates a domain name with the changing IP address, and use the domain name in your application's connection settings instead of the IP address. The dynamic DNS service updates the record whenever the address changes. Note that this only helps when (re)establishing connections; it cannot keep an already established connection alive when the client's address changes.

3. Reconnect logic: Implement logic in your application to periodically check the connection status and reconnect if it's lost. This way, if your IP address changes, your application can reestablish the connection to the PostgreSQL server automatically.

By implementing one or a combination of these solutions, you can mitigate the issue of losing connection to the remote PostgreSQL server due to a dynamic IP address change.
"

"
Yes, there are a few things you can do within your Lazarus/FreePascal desktop application to mitigate the problem of losing connection to the remote PostgreSQL server due to a dynamic IP address change. Here are a couple of suggestions:

1. Connection monitoring: Implement a mechanism within your application to monitor the connection status continuously. You can periodically ping the server or send lightweight queries to keep the connection alive. If the connection is lost, your application can attempt to reconnect automatically.

2. Connection pooling: Consider using connection pooling in your application. Connection pooling involves creating and managing a pool of database connections that can be reused by multiple client requests. Pooled connections are still ordinary TCP connections, so they will also drop when the client's address changes; what pooling buys you is that a broken connection can be discarded and a fresh one handed out transparently the next time a request needs one.

By implementing these techniques, your application can proactively monitor and manage the connection to the PostgreSQL server, increasing its resilience to dynamic IP address changes and reducing the chances of losing the connection during long-running queries.
"

"
Certainly! Here are code examples demonstrating the two suggested approaches: connection monitoring and connection pooling.

1. Connection Monitoring:
```pascal
uses
  ..., SysUtils, DB, PQConnection;

var
  Connection: TPQConnection;

procedure MonitorConnection;
begin
  if not Connection.Connected then
  begin
    try
      Connection.Open;
      Writeln('Connection reestablished.');
    except
      on E: Exception do
        Writeln('Error reconnecting: ' + E.Message);
    end;
  end;
end;

begin
  Connection := TPQConnection.Create(nil);
  try
    Connection.HostName := 'your_postgresql_server_address';
    Connection.DatabaseName := 'your_database_name';
    Connection.UserName := 'your_username';
    Connection.Password := 'your_password';

    // Set up additional connection properties as needed

    while True do
    begin
      MonitorConnection;
      Sleep(5000); // pause between checks; in a GUI application use a TTimer instead of a blocking loop
    end;
  finally
    Connection.Free;
  end;
end.
```

In this example, we define a `MonitorConnection` procedure that checks whether the connection is down (`Connection.Connected` returns `False`) and, if so, attempts to reconnect using `Connection.Open`. You can customize this code to suit your specific needs, such as adjusting the interval between connection checks. Note that after a silent network drop `Connection.Connected` may still report `True` until an operation actually fails, so a lightweight test query can be a more reliable health check.

2. Connection Pooling:
```pascal
uses
  ..., DB, SQLDB, PQConnection, Data.DBConnectionPool;
  // Note: Data.DBConnectionPool (TDBConnectionPool, TConnectionFactory) is not
  // part of the FCL; it stands in here for whatever pooling component you use.

var
  ConnectionPool: TDBConnectionPool;

// FPC 3.2 has no anonymous functions, so the factory callback is a named function.
function CreatePQConnection: TCustomConnection;
var
  Conn: TPQConnection;
begin
  Conn := TPQConnection.Create(nil);
  Conn.HostName := 'your_postgresql_server_address';
  Conn.DatabaseName := 'your_database_name';
  Conn.UserName := 'your_username';
  Conn.Password := 'your_password';
  Conn.Transaction := TSQLTransaction.Create(Conn); // each pooled connection gets its own transaction
  // Set up additional connection properties as needed
  Result := Conn;
end;

procedure InitializeConnectionPool;
begin
  ConnectionPool := TDBConnectionPool.Create(TConnectionFactory.Create(@CreatePQConnection));
  ConnectionPool.MaxConnections := 10;
end;

procedure UseDatabaseConnection;
var
  Connection: TPQConnection;
  Query: TSQLQuery;
begin
  Connection := ConnectionPool.AcquireConnection as TPQConnection;
  Query := TSQLQuery.Create(nil);
  try
    Query.Database := Connection;
    Query.Transaction := Connection.Transaction;
    Query.SQL.Text := 'SELECT * FROM your_table';
    Query.Open;
    // Perform your database operations
    Query.Close;
  finally
    Query.Free;
    ConnectionPool.ReleaseConnection(Connection);
  end;
end;

begin
  InitializeConnectionPool;

  // Example usage of the connection pool
  UseDatabaseConnection;

  ConnectionPool.Free; // Remember to free the connection pool when done
end.
```

In this example, we use a `TDBConnectionPool` class from an assumed `Data.DBConnectionPool` unit (such a pool is not part of the FCL; substitute whatever pooling component your project actually provides) to create and manage a pool of PostgreSQL connections. The `InitializeConnectionPool` procedure sets up the connection factory, which creates instances of `TPQConnection` via a named callback function. The `UseDatabaseConnection` procedure demonstrates how to acquire a connection from the pool, use it for executing queries, and release it back to the pool when finished.

Please note that these code examples provide a basic structure and may require additional modifications to fit your specific application requirements.
"
"
In a Lazarus/FreePascal desktop application, you would typically place the code within the appropriate event handlers or procedures that are relevant to your application's functionality. Here's a breakdown of where you can put the code in a real application:

1. Connection Monitoring:
   - Place the code that sets up the `TPQConnection` and the `MonitorConnection` procedure in the appropriate unit or module of your application.
   - Consider placing the connection setup code in the application's initialization section or a dedicated initialization procedure.
   - In the main program flow or within a background thread, call the `MonitorConnection` procedure periodically to check the connection status and attempt reconnection if necessary.
   - You can use a timer component or a loop with a sleep interval to control the timing of the connection checks.

2. Connection Pooling:
   - Place the code that sets up the `TDBConnectionPool` and the connection usage code in the appropriate unit or module of your application.
   - Consider placing the connection pool setup code in the application's initialization section or a dedicated initialization procedure.
   - Depending on your application's structure, you can call the `UseDatabaseConnection` procedure in response to user actions, specific events, or within a specific business logic section.
   - Ensure that you properly handle exceptions and release connections back to the pool even in case of errors.

Remember that these examples are just templates, and you should adapt them to fit your application's specific architecture and requirements. You may need to consider thread safety, error handling, and additional components or units that your application uses.
"
« Last Edit: July 16, 2023, 10:24:37 am by tatamata »

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
Let me share some replies from chatGPT.

We'd rather you didn't, and we'd particularly rather you didn't insert large blocks of code without proper markup.

I'm sorry, but I've made some comments already and while I'm interested in helping my time budget doesn't extend to reading that stuff.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

egsuh

  • Hero Member
  • *****
  • Posts: 1600
AFAIK visual controls do not need to keep the connection open. You may use CommitRetaining to cut the direct connection.

If by "remote server" you mean a computer behind your home router, then you can give that PC a fixed internal IP address.

If "remote server" means a server that you want to publish on the internet, then you may use DDNS.
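For what it's worth, a minimal SQLdb sketch of that idea (the table name and the way the components are wired up are placeholders): fetch the whole result set into the dataset's local buffer, then CommitRetaining so the server is not left holding an open transaction while the bound controls keep showing the data.

```pascal
uses
  sqldb;  // TSQLTransaction, TSQLQuery; the connection itself would be a TPQConnection

procedure FetchAllThenCommit(ATrans: TSQLTransaction; AQuery: TSQLQuery);
begin
  // Assumes AQuery.Database and AQuery.Transaction are already assigned.
  AQuery.Close;
  AQuery.SQL.Text := 'SELECT * FROM some_table';  // placeholder query
  AQuery.PacketRecords := -1;  // fetch the whole result set, not 10-row chunks
  AQuery.Open;
  AQuery.Last;                 // force every row into the local buffer
  AQuery.First;
  ATrans.CommitRetaining;      // end the server-side transaction but keep the
                               // dataset (and the data shown in the grid) open
end;
```

If the connection drops afterwards, the rows already in the buffer should stay displayable; only a refresh, or posting edits back, would then need a working connection again.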
« Last Edit: July 16, 2023, 01:13:29 pm by egsuh »

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
They certainly used to, and would get very unhappy if asked to handle e.g. a resize if the connection went down. I have keen memories of a weekend crash caused by Debian updating the PostgreSQL server, which necessitated a daemon restart and resulted in all of the client systems' current connections being dropped.

That might obviously have changed, or it might be possible to have something as a placeholder which would defer the problem until the next time a query has to be made to (re)populate a visual control with some sort of reconnection at that point. But generally speaking this is one of a number of design flaws which have affected RAD tools since their earliest days of connecting to backend servers.

And there might obviously be OS-specific behaviour regarding existing (TCP) connections if the endpoint IP address changed. Apart from anything else, a change would be very likely to impact on iptables/netfilter/routing etc. entries on Linux, or their equivalent on other OSes.

The bottom line is that the overall SQL/relational architecture did not allow for the case where connection duration was significantly longer than query/transaction duration.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

Zvoni

  • Hero Member
  • *****
  • Posts: 2963
I have a desktop application connecting to a remote PostgreSQL server using TPQConnection.
During very long-running queries the connection is sometimes lost because the dynamic IP address on my router changes.
Is there anything that can be done in the app to prevent this?
This sounds like a published port on the Internet, which is a Big NO-NO!
Use a WebService to handle your requests
One System to rule them all, One Code to find them,
One IDE to bring them all, and to the Framework bind them,
in the Land of Redmond, where the Windows lie
---------------------------------------------------------------------
Code is like a joke: If you have to explain it, it's bad

af0815

  • Hero Member
  • *****
  • Posts: 1383
Quote
that can have runtime for hours or even days
Such long-running statements should not be done over a connection; this kind of data grabbing/analysing is better run as a job on the server itself. An IP connection can be lost, changed, or slow by design, and that is normal.

BTW, this is no playground for ChatGPT answers. In other forums they are simply deleted, because they are normally unwanted.
regards
Andreas

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
I have a desktop application connecting to a remote PostgreSQL server using TPQConnection.
During very long-running queries the connection is sometimes lost because the dynamic IP address on my router changes.
Is there anything that can be done in the app to prevent this?
This sounds like a published port on the Internet, which is a Big NO-NO!
Use a WebService to handle your requests

That's not how I read it. My interpretation was that his server was somewhere else (e.g. at his office) but his home router changed IP addresses and ripped down existing connections whenever the DHCP lease was renewed or as the result of noise on the line. Those are, of course, issues which should be resolved by the ISP- or if necessary by changing ISP- but that's outside the scope of the question.

I think you're the Local Expert: does having DB-aware components active require a live connection? If so, then what can be done about it since that's not how SQL was intended to be used (with the exception of PostgreSQL notifications etc.).

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
BTW, this is no playground for ChatGPT answers. In other forums they are simply deleted, because they are normally unwanted.

Apart from anything else, they are extremely undesirable in a forum to which people might be directed by Google etc. when they're looking for reliable information.

I didn't read it in detail because it wasn't marked up and because I've got /rather/ a lot of other crap on my plate, but the impression I got was that it didn't even address the question that OP was asking: and that should have been obvious to him. Hence "playground" or "sandpit" is very much the correct way of describing it.

So quite frankly my preference would be that he deleted that posting (or at least removed all the ChatGPT stuff from it) at which point we can also remove these messages being critical of him.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

Zvoni

  • Hero Member
  • *****
  • Posts: 2963
That's not how I read it. My interpretation was that his server was somewhere else (e.g. at his office) but his home router changed IP addresses and ripped down existing connections whenever the DHCP lease was renewed or as the result of noise on the line. Those are, of course, issues which should be resolved by the ISP- or if necessary by changing ISP- but that's outside the scope of the question.
Hmm... OK, agreed, it could be interpreted that way: this "remote" server is on a LAN.
Now the question: has that server a static IP? (Which it should have!!)
OTOH: changing IPs could/should be avoided by using a properly set up DNS, and any connection should use hostnames.


Quote
I think you're the Local Expert: does having DB-aware components active require a live connection? If so, then what can be done about it since that's not how SQL was intended to be used (with the exception of PostgreSQL notifications etc.).

MarkMLl
Thanks for that vote of confidence  :P
I don't consider myself an expert (local or otherwise), and I'm definitely the wrong person to ask about DB-bound controls, since I've never used them (exactly because of those "questions" you've posted).
That said: any DB-aware control (DBGrid etc.) has a TDataSource which delegates to a TDataSet.
https://lazarus-ccr.sourceforge.io/docs/fcl/db/tdatasource.html
Quote
TDatasource is a mediating component: it handles communication between any DB-Aware component (often edit controls on a form) and a TDataset instance.
Logic dictates that you don't need a "live" connection, since the TDataSource can be fed by "anything" which provides a TDataSet (an in-memory DB, a JSONDataSet etc.) but doesn't have a connection.
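For illustration, a minimal sketch of that: copy an opened TSQLQuery into a standalone TBufDataset and point the TDataSource at the copy, after which the grid no longer cares whether the connection is still up (all names are placeholders):

```pascal
uses
  db, sqldb, BufDataset;

// Copy the rows of an already opened query into an in-memory dataset and
// repoint the data source, so the bound controls survive a dropped connection.
procedure DetachFromConnection(Src: TSQLQuery; Dst: TBufDataset; DS: TDataSource);
var
  i: Integer;
begin
  Dst.Close;
  Dst.FieldDefs.Assign(Src.FieldDefs);  // clone the result set's structure
  Dst.CreateDataset;
  Src.First;
  while not Src.EOF do
  begin
    Dst.Append;
    for i := 0 to Src.FieldCount - 1 do
      Dst.Fields[i].Assign(Src.Fields[i]);  // copy the current row field by field
    Dst.Post;
    Src.Next;
  end;
  Dst.First;
  DS.DataSet := Dst;  // the DBGrid attached to DS now shows the local copy
end;
```

After that the TSQLQuery, and even the whole connection, can be closed without disturbing what the grid shows.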
« Last Edit: July 17, 2023, 08:56:33 am by Zvoni »
One System to rule them all, One Code to find them,
One IDE to bring them all, and to the Framework bind them,
in the Land of Redmond, where the Windows lie
---------------------------------------------------------------------
Code is like a joke: If you have to explain it, it's bad

MarkMLl

  • Hero Member
  • *****
  • Posts: 8388
Hmm... OK, agreed, it could be interpreted that way: this "remote" server is on a LAN.
Now the question: has that server a static IP? (Which it should have!!)

Definitely.

Quote
OTOH: changing IPs could/should be avoided by using a properly set up DNS, and any connection should use hostnames.

Partly agreed. Relying on a DNS server means that you've got two possible servers which could fail, not just one. In the end, I prefer to make a DNS query at the start of a program run, then to hold the IP address in numeric form for as long as is reasonable.
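For illustration, a sketch of that resolve-once approach, assuming the resolve unit from fcl-net is available (the host name is a placeholder, and the byte-order handling follows my reading of that unit, so treat it as a starting point rather than gospel):

```pascal
uses
  SysUtils, sockets, resolve;

// Look the server up once at program start and keep the numeric address;
// reuse that for (re)connects instead of consulting DNS every time.
function ResolveOnce(const AHost: string): string;
var
  HR: THostResolver;
begin
  HR := THostResolver.Create(nil);
  try
    if not HR.NameLookup(AHost) then
      raise Exception.CreateFmt('Cannot resolve %s', [AHost]);
    Result := HostAddrToStr(HR.HostAddress);  // e.g. '192.0.2.10'
  finally
    HR.Free;
  end;
end;
```

The returned string can then be put into TPQConnection.HostName, so later reconnect attempts no longer depend on the DNS server being reachable (at the cost of not noticing if the server's own address is ever changed).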

Quote
Thanks for that vote of confidence  :P
I don't consider myself an expert (local or otherwise), and I'm definitely the wrong person to ask about DB-bound controls, since I've never used them (exactly because of those "questions" you've posted).
That said: any DB-aware control (DBGrid etc.) has a TDataSource which delegates to a TDataSet.
https://lazarus-ccr.sourceforge.io/docs/fcl/db/tdatasource.html
Quote
TDatasource is a mediating component: it handles communication between any DB-Aware component (often edit controls on a form) and a TDataset instance.
Logic dictates that you don't need a "live" connection, since the TDataSource can be fed by "anything" which provides a TDataSet (an in-memory DB, a JSONDataSet etc.) but doesn't have a connection.

Hmm. OK, here's my position. I've got a large chunk of mission critical code talking to a local server which uses DB-aware controls, but this was something I originally wrote 20 years ago and I'm not happy about the situation for reasons already discussed.

My understanding is that DB-aware components require a live connection, and if that goes down they will at the very least become unresponsive (i.e. unable to handle redraw events etc. without error). I might be wrong in this or things might have moved on in 20 years, but I think I've seen discussion of the hoops one has to jump through to automatically reconnect and it wasn't a one-property fix.

SQL was originally designed to process a query, return a result, and to be tolerant of interruption when not actively processing a query or transaction. That was weakened slightly when cursors etc. were introduced which allowed results to be returned in chunks.

Every RAD that I've seen relies on being able to hold the connection open indefinitely. That is bad. Add to that that various DB-aware office utilities etc. go badly wrong if asked to inspect an unexpectedly-large table: that is also bad.

I put quite a lot of work into a testbed program a few years ago which doubled-up the DB connection and was able to use one to request a data update (I was using a local weather-station table as the experimental backend) and the other to display the last-known data. I can't remember exactly how well that worked, but in any event it was non-trivial. It also used PostgreSQL's notification facility (predating the LCL's support for this) to become aware of new data, and my opinion is that the only reason a connection should be kept open is for that type of asynchronous event.

So what's really needed- and I don't know where Lazarus stands on this- is a way of making a query, parsing it into a control without the assumption that there is still a live connection, and having comparable facilities to send updates back to the server. Hence your

Quote
Logic dictates that you don't need a "live" connection, since the TDataSource can be fed by "anything" which provides a TDataSet (an in-memory DB, a JSONDataSet etc.) but doesn't have a connection.

Can a local dataset cache the result of a remote query automatically and reliably?

Which I think echoes the OP's question.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Logitech, TopSpeed & FTL Modula-2 on bare metal (Z80, '286 protected mode).
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

Чебурашка

  • Hero Member
  • *****
  • Posts: 588
  • СЛАВА УКРАЇНІ! / Slava Ukraïni!
Let me share some replies from chatGPT.

So now the forum should become a "ChatGPT" validation/explanation place?

It would be better if ChatGPT or other such things, if they are seen as real sources of information, were also held responsible for the understandability and comprehensiveness of their answers. I.e. once you have asked them and you don't understand the answer, or it doesn't work, please go back to them, not to the forums.
FPC 3.2.0/Lazarus 2.0.10+dfsg-4+b2 on Debian 11.5
FPC 3.2.2/Lazarus 2.2.0 on Windows 10 Pro 21H2

 
