OK, let me say up front that I am aware that this kind of less concrete request is harder to respond to in a forum than questions that deal directly with code. Still, I am seeking experiences from those who have made similar efforts in the past, and whether, in hindsight, there were lessons learned that would have led to a different strategy:
Introduction: I am in the early stages of converting a data acquisition application written in TP 5.5 that I have developed and maintained for more than 30 years, up to the present day, and that I and my colleagues rely upon heavily for our research. Some work on establishing a new file format and translating old parameter files has already been done: I chose the .ini file format, building upon the IniFiles unit, expanded and specialized for my purposes. One of the big priorities is stable file formats that can be extended later, with backwards compatibility provided through import of the old binary parameter files. Since then there has been a lapse in the effort, which I am now funded to continue.

I am concerned about maintaining the stability and integrity of this very well tested code with respect to correctness of calculations. The data file format for slow sampling is simple tab-delimited text (CSV-style), which I see no reason to change except for UTF-8 compatible variable headings and units. The code is about 90,000 program lines, so this is not a small task and is going to take a lot of time.
I wrote my own text-based and graphics-based menu systems in TP 5.5, so the program is not based on any third-party user interface libraries. The graphing part is based on a subset of the Turbo Halo library, which provides basic graphics primitives, including an excellent world coordinate system that allows mapping a plot to real data coordinates. However, the graphing code will have to be replaced, possibly with TAChart, unless I find obstacles with respect to special features that would necessitate translating my old code.
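In case the Halo-style world coordinate system does turn out to be worth carrying over (TAChart has its own axis transformations, but a hand-rolled plot surface may still need it), the mapping is just a linear transform per axis. Here is a minimal sketch in Free Pascal; all names are hypothetical, and this only illustrates the principle:

```pascal
program WorldCoordDemo;
{$mode objfpc}{$H+}

type
  // Hypothetical viewport: maps world (data) coordinates to pixel
  // coordinates, in the spirit of the Halo world coordinate system.
  TViewport = record
    WX1, WY1, WX2, WY2: Double;    // world window (data units)
    PX1, PY1, PX2, PY2: Integer;   // pixel rectangle (PY1 = top row)
  end;

function WorldToPixelX(const V: TViewport; X: Double): Integer;
begin
  Result := V.PX1 + Round((X - V.WX1) / (V.WX2 - V.WX1) * (V.PX2 - V.PX1));
end;

function WorldToPixelY(const V: TViewport; Y: Double): Integer;
begin
  // Y axis is inverted: larger world Y means a smaller pixel row.
  Result := V.PY2 - Round((Y - V.WY1) / (V.WY2 - V.WY1) * (V.PY2 - V.PY1));
end;

var
  V: TViewport;
begin
  V.WX1 := 0;  V.WX2 := 10;  V.WY1 := 0;  V.WY2 := 100;
  V.PX1 := 50; V.PX2 := 650; V.PY1 := 20; V.PY2 := 420;
  WriteLn(WorldToPixelX(V, 5.0));   // midpoint of X range -> 350
  WriteLn(WorldToPixelY(V, 0.0));   // bottom of Y range   -> 420
end.
```

With something like this as the only place pixel coordinates appear, the plotting routines themselves stay in world coordinates regardless of which back end eventually draws the lines.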
The program collects data and controls hardware (mostly through serial ports these days, but there is also old support for a Labmaster board and parallel ports), scales data, performs specialized calibrations and calculations in my field of expertise, and plots data in real time. In addition there are a number of routines for specialized imports and data manipulation. One challenge is some overlap in the use of parameters between these functionalities.
A main point that has caused hesitation is to what degree to "objectify" the code (i.e. create manageable classes), and at what level of detail. The current code is almost exclusively procedural, but in many cases it passes a large number of parameters to procedures, which in some respects resembles the object-oriented philosophy of providing an interface. However, the parameters for scaling and calibration, the very extensive calculations, and the graphing are stored in global dynamic records. I am fairly sure I will have to use multithreading for the data acquisition part. The main data are stored in complicated dynamically allocated array structures (built to overcome the 64K limitations of DOS memory), also global, which I think can be simplified by using a dynamic array with stepwise allocation of new chunks of "cells", as before. (But how do I access it in an object-oriented way when so many parts of the program manipulate and use it?)
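For what it is worth, the old chunk-allocated structures can collapse into a single flat dynamic array now that the 64K segment limit is gone, and wrapping it in one class gives the rest of the program a single access point instead of a global. A minimal Free Pascal sketch, with all names hypothetical:

```pascal
program ChunkedBufferDemo;
{$mode objfpc}{$H+}
uses SysUtils;

type
  // Hypothetical wrapper: one flat growable buffer replacing the old
  // DOS-era linked chunks, still growing in fixed-size steps so that
  // long acquisitions do not reallocate on every sample.
  TSampleBuffer = class
  private
    FData: array of Double;   // single dynamic array, no 64K limit
    FCount: Integer;
    FChunk: Integer;          // growth step, e.g. 4096 cells
    function GetItem(Index: Integer): Double;
  public
    constructor Create(AChunk: Integer);
    procedure Add(Value: Double);
    property Count: Integer read FCount;
    property Items[Index: Integer]: Double read GetItem; default;
  end;

constructor TSampleBuffer.Create(AChunk: Integer);
begin
  FChunk := AChunk;
end;

procedure TSampleBuffer.Add(Value: Double);
begin
  if FCount = Length(FData) then
    SetLength(FData, Length(FData) + FChunk);  // allocate next chunk
  FData[FCount] := Value;
  Inc(FCount);
end;

function TSampleBuffer.GetItem(Index: Integer): Double;
begin
  if (Index < 0) or (Index >= FCount) then
    raise ERangeError.CreateFmt('Index %d out of range', [Index]);
  Result := FData[Index];
end;

var
  Buf: TSampleBuffer;
  i: Integer;
begin
  Buf := TSampleBuffer.Create(4096);
  try
    for i := 1 to 10000 do
      Buf.Add(i * 0.5);
    WriteLn(Buf.Count);       // 10000
    WriteLn(Buf[9999]:0:1);   // 5000.0
  finally
    Buf.Free;
  end;
end.
```

The answer to the "many parts of the program use it" question would then be: those routines receive the TSampleBuffer reference as a parameter (or via a property of their owning class) rather than reaching for a global; only the growth policy lives inside the class. If the acquisition thread writes while other code reads, access to Add would additionally need a critical section.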
My current idea is to wrap fairly large portions of code in classes according to functionality, with each class containing its own parameters for its area, and with each class also containing the code that calls the ini-file unit to save those parameters to a common ini file (completely rewriting the ini file each time) that accompanies the tab-delimited text file.
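That plan maps quite naturally onto a small abstract base class: each functional area descends from it, owns one ini section, and supplies defaults on load so that older or partial files remain readable. A sketch of the idea in Free Pascal, assuming the standard IniFiles unit; all class, section, and key names here are hypothetical:

```pascal
program ParamOwnersDemo;
{$mode objfpc}{$H+}
uses SysUtils, IniFiles;

type
  // Hypothetical base class: each functional area (scaling, calibration,
  // graphing, ...) owns one section of the shared ini file.
  TParamOwner = class
    procedure SaveToIni(Ini: TCustomIniFile); virtual; abstract;
    procedure LoadFromIni(Ini: TCustomIniFile); virtual; abstract;
  end;

  TCalibrationParams = class(TParamOwner)
    Slope, Intercept: Double;
    procedure SaveToIni(Ini: TCustomIniFile); override;
    procedure LoadFromIni(Ini: TCustomIniFile); override;
  end;

procedure TCalibrationParams.SaveToIni(Ini: TCustomIniFile);
begin
  Ini.WriteFloat('Calibration', 'Slope', Slope);
  Ini.WriteFloat('Calibration', 'Intercept', Intercept);
end;

procedure TCalibrationParams.LoadFromIni(Ini: TCustomIniFile);
begin
  // Defaults keep old files loadable when new keys are absent.
  Slope := Ini.ReadFloat('Calibration', 'Slope', 1.0);
  Intercept := Ini.ReadFloat('Calibration', 'Intercept', 0.0);
end;

procedure SaveAll(const FileName: string; const Owners: array of TParamOwner);
var
  Ini: TMemIniFile;
  i: Integer;
begin
  DeleteFile(FileName);               // complete rewrite each time
  Ini := TMemIniFile.Create(FileName);
  try
    for i := 0 to High(Owners) do
      Owners[i].SaveToIni(Ini);
    Ini.UpdateFile;                   // TMemIniFile buffers until here
  finally
    Ini.Free;
  end;
end;

var
  Cal: TCalibrationParams;
  Ini: TMemIniFile;
begin
  Cal := TCalibrationParams.Create;
  Cal.Slope := 2.0; Cal.Intercept := 0.5;
  SaveAll('run_params.ini', [Cal]);
  Cal.Slope := 0;                     // clobber, then reload
  Ini := TMemIniFile.Create('run_params.ini');
  try
    Cal.LoadFromIni(Ini);
  finally
    Ini.Free;
  end;
  WriteLn('Slope=', Cal.Slope:0:1);   // Slope=2.0
  Cal.Free;
end.
```

TMemIniFile buffers everything in memory and writes once in UpdateFile, which fits a rewrite-the-whole-file-each-time approach better than TIniFile's write-through behavior.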
However, I then recalled an old comment by Marco Cantu (which I unfortunately cannot find at the moment, as it was a while ago) in which he recommended keeping as much code as possible in its original procedural form when converting a DOS program; in other words, not going very far into turning it into classes.
At this early stage I am concerned about not working myself into a corner, as I envision this application to be usable and maintainable long into the future. So again, are there any experiences to take along on the way?
I have published a couple of papers relating to some functional aspects of the code, if anyone gets very interested:
Tøien, Ø. (1992) Data acquisition in thermal physiology: measurements of shivering. J. Therm. Biol. 17:357-366.
Tøien, Ø. (2013) Automated open flow respirometry in continuous and long-term measurements: design and principles. J. Appl. Physiol. 114:1094-1107.