
Author Topic: Sharing same source file in two projects  (Read 9406 times)

alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #30 on: October 05, 2021, 12:48:59 pm »
*snip*
I don't understand why this is less practical than the current solution with two different build buttons. What I proposed was just simple logic that automatically decides which of the two buttons to "press":

* Options didn't change, so "compile" is sufficient (let FPC just look at the timestamps of each unit)
* Options changed, so "build" is needed

Nothing else. In particular, I was not talking about rebuilding FCL packages; they have their own options anyway and don't need rebuilding when the compile options in your app have changed.
*snip*
As a matter of fact, the C/C++ IDEs I've worked with trigger a "re-build" (which is the other name for "build" in Laz) on any change in the project properties, including options, defines, etc.

They don't hide the "build" command ("compile" in Laz); they just mark the change, and the next "build" becomes a "re-build". I'm talking here about IDEs with an internal build system, not something layered on top of the make program.
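A sketch of that decision logic (illustrative Python; the function names and the option format are invented for this example, this is not how Lazarus stores its settings):

```python
import hashlib
import json

def options_fingerprint(options):
    """Hash the project's compiler options in a stable (sorted) form."""
    canonical = json.dumps(options, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def choose_action(current_options, saved_fingerprint):
    """Return 'compile' when the options are unchanged, 'build' otherwise."""
    if saved_fingerprint == options_fingerprint(current_options):
        return "compile"  # incremental: let FPC compare unit timestamps
    return "build"        # options changed: force a full rebuild

opts = {"defines": ["WINSVC"], "mode": "objfpc"}
fp = options_fingerprint(opts)
print(choose_action(opts, fp))                               # compile
print(choose_action({"defines": [], "mode": "objfpc"}, fp))  # build
```

The point is only that the IDE would press the right "button" automatically; the "compile" path still relies on FPC's own timestamp checks.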
"I'm sorry Dave, I'm afraid I can't do that."
—HAL 9000

ccrause

  • Hero Member
  • *****
  • Posts: 845
Re: Sharing same source file in two projects
« Reply #31 on: October 05, 2021, 02:56:03 pm »
*snip*
I don't understand why this is less practical than the current solution with two different build buttons. What I proposed was just simple logic that automatically decides which of the two buttons to "press":

* Options didn't change, so "compile" is sufficient (let FPC just look at the timestamps of each unit)
* Options changed, so "build" is needed

Nothing else. In particular, I was not talking about rebuilding FCL packages; they have their own options anyway and don't need rebuilding when the compile options in your app have changed.
*snip*

As a matter of fact, the C/C++ IDEs I've worked with trigger a "re-build" (which is the other name for "build" in Laz) on any change in the project properties, including options, defines, etc.

They don't hide the "build" command ("compile" in Laz); they just mark the change, and the next "build" becomes a "re-build". I'm talking here about IDEs with an internal build system, not something layered on top of the make program.
Lazarus already does some tracking of project-level defines. The example I concocted earlier was handled without problems by Lazarus, even if I changed the define in the project options and just pressed Run. If the project-level define changed, the precompiled unit was recompiled, even though its source wasn't touched. This worked in Lazarus until I compiled the project from the command line with a command-line define. Then Lazarus (obviously) lost track of the define used when the unit was previously compiled, and the unit and the main program got out of sync about which define was active where.

Thus, in my brief experiment with Lazarus 2.0.12, there appears to be support for tracking project defines and rebuilding when necessary. The question then is: when does Lazarus lose track of a changed define? When a define is in the project custom options section, in an include file, in a compiler config file, or in some other combination of factors?

MarkMLl

  • Hero Member
  • *****
  • Posts: 6676
Re: Sharing same source file in two projects
« Reply #32 on: October 05, 2021, 04:05:07 pm »
In the case which first attracted my ire it was with this:

Code: Pascal
program PythonDemo;

{$mode objfpc}{$H+}

{$define DYNAMIC }
{$define PYTHON2 }

(* Assume that there is only superficial similarity between the Python v2 and   *)
(* v3 script syntax and semantics. This demo is, more than anything else, the   *)
(* "point of contact" between the various implementation possibilities at the   *)
(* time of development.                                                         *)

{$ifdef PYTHON2 }
{$unitpath ./python2 }
{$note ===== Intending to link to Python v2 ===== }
{$else          }
{$unitpath ./python3 }
{$note ===== Intending to link to Python v3 ===== }
{$endif PYTHON2 }

(* WARNING: since the unit path is being set up during compilation, Lazarus's   *)
(* build facility might not successfully retire unwanted files. If in doubt do  *)
(* a "Clean directory".                                                         *)

uses
  {$IFDEF UNIX}
  cthreads,
  {$ENDIF}
  Classes,
{$ifdef DYNAMIC }
  python_dynamic;                       (* "Python" is a shared-library object  *)
{$note ===== Intending to link dynamically ===== }
{$else          }
  python;                               (* "Python" is a unit file              *)
{$note ===== Intending to link statically ===== }
{$endif DYNAMIC }

{$macro on}
{$define NL:= LineEnding }

var
  localArgv: APChar;                    (* For PySys_SetArgv()                  *)
  debugging: boolean= false;

begin
  InitializeLibrary;                    (* Ensures Python.ModuleInMemory valid  *)

(* Report on the state of the interface unit or object as early as possible.    *)

{$push } {$warnings off   Suppress warnings about unreachable code }
  if Python.IsDynamic then
    Write('Python is dynamically linked, ')
  else
    Write('Python is statically linked, ');
  if Python.ModuleInMemory then
    Write('and has ')
  else
    Write('but has not ');
  WriteLn('been successfully loaded.');
{$pop }

(* InitLocalArgv() is a convenience routine inside the Python unit or object,   *)
(* but does not call into libpython.                                            *)

{$push } {$hints off   Suppress spurious uninitialised managed variable hint }
  Python.InitLocalArgv(localArgv, debugging);   (* localArgv and debugging flag *)
{$pop }
  if debugging then
    WriteLn('Debugging is enabled.');   (* Special --DEBUG-- option             *)

(* Report Python version etc., this calls into libpython but does not rely on   *)
(* script execution. Note that while Py_SetProgramName() has to be called       *)
(* before Py_Initialize in order to set up internal paths etc., PySys_SetArgv() *)
(* apparently has to be called afterwards.                                      *)

{$push } {$notes off   Suppress note about possible failure to inline StrPas() }
  WriteLn(StrPas(Python.Py_GetVersion()));
{$pop }
  Python.Py_SetProgramName(PChar(ParamStr(0))); (* Before _Initialize           *)

(* Initialise the embedded Python implementation, test that functions requiring *)
(* a variable number of parameters work properly, and run the traditional       *)
(* trivial script. For the moment at least, expect output to go to the console. *)

  Python.Py_Initialize;
{$if declared(HasLoadVarargsRoutine) }
  Python.LoadVarargsRoutine('*', true); (* Try to load all varargs routines     *)
{$endif }
{$ifdef DYNAMIC }
  Assert(Assigned(Python.PySys_WriteStdout) and Assigned(Python.PySys_WriteStderr),
                                        'Dynamic varargs load error');
{$endif DYNAMIC }
  Python.PySys_WriteStdout('%s %s' + NL, ['Test message via', 'PySys_WriteStdout()']);
  Python.PySys_WriteStderr('%s %s' + NL, ['Test message via', 'PySys_WriteStderr()']);
  try
//    Py_InitModule("emb", EmbMethods);   // emb would be part of this Pascal program
    Python.PySys_SetArgv(Length(localArgv), @localArgv[0]);
    Python.PyRun_SimpleString(
        'import sys' + NL +             (* Just to check EOL convention         *)
        'print(sys.argv[0:])' + NL +
        'print("Hello, World!")' + NL
    );
  finally
    Python.Py_Finalize
  end
end.

i.e. with defines in the main program file. It was put to me at the time that unitpath was problematic... and that includes were problematic... so what more should I expect... which was one of the reasons that I didn't pursue it further once I had a workaround.

MarkMLl
MT+86 & Turbo Pascal v1 on CCP/M-86, multitasking with LAN & graphics in 128Kb.
Pet hate: people who boast about the size and sophistication of their computer.
GitHub repositories: https://github.com/MarkMLl?tab=repositories

JuhaManninen

  • Global Moderator
  • Hero Member
  • *****
  • Posts: 4459
  • I like bugs.
Re: Sharing same source file in two projects
« Reply #33 on: October 05, 2021, 07:45:29 pm »
y.ivanov, I would refactor the code so that IFDEFs are not needed. It typically makes the design cleaner and easier to maintain.
You would avoid the problems you have now. Pascal compilation can track unit dependencies well and doesn't compile uselessly as often as C++.
Keep DEFINEs and IFDEFs for OS dependent code only and they won't cause trouble.

In practice make a base class and derived classes for console and GUI code, or use events and assign different handlers for console and GUI.
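As an illustration of that suggestion (a hypothetical sketch in Python rather than Pascal; all class and function names are invented), the WINSVC/console choice from earlier in the thread could become a strategy object chosen per project, so the shared code needs no IFDEFs:

```python
class WaitStrategy:
    """Base class: how the server waits for termination."""
    def wait(self):
        raise NotImplementedError

class ConsoleWait(WaitStrategy):
    """Console build: wait for the Enter key (ReadLn in the Pascal code)."""
    def wait(self):
        return "wait for Enter"

class ServiceWait(WaitStrategy):
    """Service build: sleep until the service manager terminates us."""
    def wait(self):
        return "sleep until terminated"

def run_server(strategy):
    # The shared code never asks which build it belongs to; each project
    # simply hands in its own strategy (or event handler).
    return strategy.wait()

print(run_server(ConsoleWait()))  # console project
print(run_server(ServiceWait()))  # service project
```

Each project then links in only its own subclass, and the define mismatch problem cannot arise in the shared unit.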
Mostly Lazarus trunk and FPC 3.2 on Manjaro Linux 64-bit.

alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #34 on: October 06, 2021, 11:26:11 am »
y.ivanov, I would refactor the code so that IFDEFs are not needed. It typically makes the design cleaner and easier to maintain.
You would avoid the problems you have now. Pascal compilation can track unit dependencies well and doesn't compile uselessly as often as C++.
Keep DEFINEs and IFDEFs for OS dependent code only and they won't cause trouble.

In practice make a base class and derived classes for console and GUI code, or use events and assign different handlers for console and GUI.
While I appreciate your advice, I wouldn't follow it in my case, which is as simple as this:
Code: Pascal
    ...
    {$IFDEF WINDOWS}
      {$IFDEF WINSVC}
        // If it is a Windows service, sleep until terminated by the service manager
        repeat
          Sleep(1000);
        until ServerTerminated;
      {$ELSE}
        // If it is a console program, wait for the Enter key
        ReadLn; // <----- It accidentally gets linked with WINSVC defined !!!
      {$ENDIF}
    {$ELSE}
      // If it is a Linux program, terminate by ^C
      SigIntFlag := False;
      FpSignal(SIGINT, @DoSigInt);
      while not SigIntFlag do
        Sleep(1000);
    {$ENDIF}
Anyway, setting different intermediate directories for the build modes is the ultimate workaround for this.

But the reason I posted here is that I was completely unaware of how easily this could happen. The test case from ccrause (Reply #24) is quite disturbing, and IMHO the compile dependencies should be extended to cover at least the project-level defines.


marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11382
  • FPC developer.
Re: Sharing same source file in two projects
« Reply #35 on: October 06, 2021, 11:41:44 am »
Note that an IDE-based solution that simply detects changes in the project configuration doesn't fix the OP's problem, where the define problem is in a shared unit.

Such an IDE feature would be OK to have (and AFAIK already exists for some cases, since if I recompile FPC, it seems to rebuild the LCL too), but the IDE has no complete overview of all the units accessed by a compilation; the compiler determines this while running.

You can of course expand the IDE parser to do that, but it would also need to access the .ppu's to see which defines each .ppu was last compiled with (and probably that information would have to be strengthened).
« Last Edit: October 06, 2021, 11:50:22 am by marcov »

alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #36 on: October 06, 2021, 04:04:48 pm »
Note that an IDE-based solution that simply detects changes in the project configuration doesn't fix the OP's problem, where the define problem is in a shared unit.

Such an IDE feature would be OK to have (and AFAIK already exists for some cases, since if I recompile FPC, it seems to rebuild the LCL too), but the IDE has no complete overview of all the units accessed by a compilation; the compiler determines this while running.

You can of course expand the IDE parser to do that, but it would also need to access the .ppu's to see which defines each .ppu was last compiled with (and probably that information would have to be strengthened).
Please excuse my ignorance, but isn't it the IDE that just prepares the arguments for the command-line FPC compiler and then executes it? Why are we talking about separate cases, IDE vs. compiler, in the context of the 'compile' process?

Grepping through the compiler sources (not the most recent ones, though), I can see some {$IFDEF MACRO_DIFF_HINT} directives enclosing macro-processing code in fppu.pas, with accompanying comments like:
Code: Pascal
{$IFDEF MACRO_DIFF_HINT}

{
  Define MACRO_DIFF_HINT for the whole compiler (and ppudump)
  to turn this facility on. Also the hint messages defined
  below must be commented in in the msg/errore.msg file.

  There is some problems with this, thats why it is shut off:
  ...
This makes me guess someone has worked on such a feature, but it looks like it is not workable yet.

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11382
  • FPC developer.
Re: Sharing same source file in two projects
« Reply #37 on: October 06, 2021, 04:54:37 pm »
Please excuse my ignorance, but isn't it the IDE that just prepares the arguments for the command-line FPC compiler and then executes it? Why are we talking about separate cases, IDE vs. compiler, in the context of the 'compile' process?

We are not talking about separate cases. But that is _exactly_ why the IDE can't decide to rebuild everything for the OP's problem, namely that one or several units used (via uses) by the current project have also been compiled as part of a different project with possibly different settings.

I assume you know that FPC (or Lazarus) doesn't require a full picture of what files are involved in a project before starting to compile. The list of object files to link is created dynamically during the compile, and the compiler also finds all compilation units itself (contrary to basic-level C/C++).

So the commandline will be something like

Quote
  fpc -Mobjfpc -dxxx -dyyyy -Fu/a/b/c/libdir1 -Fu/a/b/c/libdir2 -Fi/a/b/c/incdir1   a.lpr

But before issuing it, the IDE doesn't know (or at least not without a _lot_ of research) that a.lpr depends on unit3 in the directory /a/b/c/libdir1.

That unit3 is also part of the b.lpr project that you compiled three hours ago with different defines, which should therefore force a build. So the set of settings that unit3 was last compiled with (for b.lpr) is different from that of the last compile of a.lpr.

As said, a possible way out would be to parse the project completely, build a list of files, and then look in each corresponding .ppu for the settings that unit was last compiled with. But there are also problems with that, since not all defines are fixed on the command line.

Here is a somewhat more convoluted case (but not unthinkable; think of all the projects that use an include file for project settings).

E.g. take an a.lpr project that includes directory a/b/c/incdir1 (-Fi) for include files and a b.lpr project that includes directory a/b/c/incdir2.

Now unit3, which is used by both, has a {$i } include file near the top that sets some defines. These are not startup defines, and thus are not listed in the .ppu.

This is very hard, if not impossible, to detect in the current situation. (As a heuristic, maybe record the timestamp of the last build of a.lpr and scan whether all .ppu's are older than that timestamp? Not airtight, as partial builds, e.g. due to a compiler error somewhere, might cause problems.)
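The heuristic in the parenthesis could be sketched like this (illustrative Python; the function name and the timestamp representation are invented for the example, and as noted it is not airtight):

```python
def needs_full_build(last_project_build, ppu_timestamps):
    """True if any .ppu was (re)written after this project's last build,
    i.e. some other project may have recompiled a shared unit since."""
    return any(ts > last_project_build for ts in ppu_timestamps)

# Project last built at t=100; b.lpr rebuilt a shared unit at t=150.
print(needs_full_build(100.0, [90.0, 95.0, 150.0]))  # True  -> force a build
print(needs_full_build(100.0, [90.0, 95.0, 99.0]))   # False -> compile is fine
```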

Quote
Grep-ing into the compiler sources (not the most recent ones, though) I can see some {$IFDEF MACRO_DIFF_HINT} directives enclosing macro processing code in fppu.pas with accompanying comments like:

Keep in mind that a hint is the lowest form of compiler feedback. It is a hint that something might be happening, but not airtight enough to be a warning or an error.

But all this depends on your problem definition. Rebuilding automatically and reliably for every scenario that might happen is, as I showed, very hard and requires continuous, costly analysis, or simply always doing a build instead of an incremental compile.

If you just want a heuristic so that an occasional project edit in the IDE propagates, to lessen the number of mistakes, I'm all for it (but as somebody said, that apparently already exists in some form or other).

Making that difference is what I wanted to trigger with the question for a problem definition.

I myself use the above include trick in all my work sources (general units parameterised with an include file which is project-specific), so for me these solutions will never work anyway; I'll simply have to force builds in some cases. But even then, any low-hanging fruit where things go automatically would be great. Really heavy-handed solutions are less interesting, since then I could simply always force a build and be done with it. IOW, the cure mustn't be worse than the problem :-)

alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #38 on: October 06, 2021, 07:56:43 pm »
The problem doesn't seem to me as complicated as it is presented.  O:-)

After a build, for each unit source file there is a corresponding .ppu file. In that .ppu there are entries for all the source files and units it depends on. To decide whether to recompile or not, we should check whether any of the listed files/units it depends on has been changed. Usually we do that by comparing the current timestamps of the actual files against the timestamps saved in the .ppu. If there is a change, then we must recompile the current unit.

Let's say we have an additional entry in the .ppu, containing the command-line defines at the time of compilation. Now we can compare whether it matches the current set of defines. If it doesn't, we must recompile.

I can't imagine where else an additional macro definition could come from, besides:
  • From a file included with the {$I file} directive: since the file will be listed in the .ppu as one the current unit depends on, its timestamp will be newer
  • From the source file itself ({$DEFINE xxx}): it is also listed at the topmost position in the .ppu, and the file timestamp will be newer
  • From the command line: we can compare with the previous set of defines
Thus, I can't see the need for multi-level research through the dep-tree and the corresponding .ppu's.
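The proposed per-unit check could be sketched like this (illustrative Python; the .ppu is modelled as a plain dictionary, and the `cmdline_defines` entry is exactly the hypothetical addition under discussion, it does not exist in real .ppu files):

```python
def must_recompile(ppu_info, fs_timestamps, current_defines):
    """ppu_info stands in for a .ppu: the saved timestamps of its listed
    dependencies, plus the proposed extra entry with the command-line
    defines used at the last compilation."""
    for name, saved_ts in ppu_info["deps"].items():
        if fs_timestamps.get(name, 0) > saved_ts:
            return True  # a listed dependency changed on disk
    # the hypothetical new check: did the command-line defines change?
    return set(ppu_info["cmdline_defines"]) != set(current_defines)

unit3 = {"deps": {"unit3.pas": 100, "opts.inc": 100},
         "cmdline_defines": ["WINSVC"]}
now = {"unit3.pas": 100, "opts.inc": 100}
print(must_recompile(unit3, now, ["WINSVC"]))  # False: nothing changed
print(must_recompile(unit3, now, []))          # True: define set differs
```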

We are not talking about separate cases. But that is _exactly_ why the IDE can't decide to rebuild all for the OP's problem, namely that one or several units USESd by the current project has also been compiled as part of a different project with possibly different settings.
Climbing up the dependency tree (I'm assuming the leaves should be examined first), we can detect the change in the command-line defines for the shared unit and recompile it, even though its source has not changed.

I assume you know that FPC (or Lazarus) doesn't require a full picture of what files are involved in a project before starting to compile. The list of object files to link is created dynamically during the compile, and the compiler also finds all compilation units itself (contrary to basic-level C/C++).

So the commandline will be something like

Quote
  fpc -Mobjfpc -dxxx -dyyyy -Fu/a/b/c/libdir1 -Fu/a/b/c/libdir2 -Fi/a/b/c/incdir1   a.lpr

But before issuing it, the IDE doesn't know (or at least not without a _lot_ of research) that a.lpr depends on unit3 in the directory /a/b/c/libdir1.

That unit3 is also part of the b.lpr project that you compiled three hours ago with different defines, which should therefore force a build. So the set of settings that unit3 was last compiled with (for b.lpr) is different from that of the last compile of a.lpr.
There will be different defines for a.lpr, thus unit3 will be recompiled when the change in defines is detected.

As said, a possible way out would be to parse the project completely, build a list of files, and then look in each corresponding .ppu for the settings that unit was last compiled with. But there are also problems with that, since not all defines are fixed on the command line.
The inline defines will be changed in some included source file; the list of those files is kept in the .ppu and their timestamps will be compared.

Here is a somewhat more convoluted case (but not unthinkable; think of all the projects that use an include file for project settings).

E.g. take an a.lpr project that includes directory a/b/c/incdir1 (-Fi) for include files and a b.lpr project that includes directory a/b/c/incdir2.

Now unit3, which is used by both, has a {$i } include file near the top that sets some defines. These are not startup defines, and thus are not listed in the .ppu.
That included file will be listed in unit3.ppu and its timestamp will be checked.

To put it in other words, can't the command-line definitions (which I presume are the same as the Laz project defines) be treated as saved in a temporary file, with each source having a dependency on it?

Maybe a simple hash of the command line, stored in the .ppu, could do the job of just detecting the change.

prof7bit

  • Full Member
  • ***
  • Posts: 161
Re: Sharing same source file in two projects
« Reply #39 on: October 06, 2021, 08:17:02 pm »
The quickest way would be to just store a hash over the compiler command line in the ppu and recompile if it changes. This would still cause some unneeded recompilation if a unit does not actually use a certain define, but it would at least prevent units from NOT being recompiled when they really should be.

The more complicated approach would be to store all compiler options individually in the ppu and mark those that did not affect the compilation of that particular unit as "DontCare". This would be a whole bunch of code, some of it in the depths of the compiler.

The hash would be much easier to implement, it would produce false positives (better safe than sorry) but it would catch OPs use case and fix his problem.

But it would also change the PPU format and probably also force other tools that parse it to make some changes or additions.
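Both variants can be sketched in a few lines (illustrative Python; the names are invented, and the option model is far simpler than real compiler state):

```python
import hashlib

def cmdline_hash(argv):
    """Variant 1: hash the whole command line; any change forces a rebuild
    (false positives included, better safe than sorry)."""
    return hashlib.sha256(" ".join(argv).encode("utf-8")).hexdigest()

def options_match(saved, current, dont_care):
    """Variant 2: compare options individually, skipping those recorded as
    not having affected this particular unit ('DontCare')."""
    keys = (set(saved) | set(current)) - dont_care
    return all(saved.get(k) == current.get(k) for k in keys)

old = {"-dWINSVC": True, "-O2": True}
new = {"-O2": True, "-gl": True}
print(cmdline_hash(["-dWINSVC", "-O2"]) == cmdline_hash(["-O2", "-gl"]))  # False
print(options_match(old, new, dont_care={"-dWINSVC", "-gl"}))             # True
print(options_match(old, new, dont_care=set()))                           # False
```

Variant 1 rebuilds on any command-line change; variant 2 avoids the false positives at the cost of tracking, per unit, which options actually mattered.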
« Last Edit: October 06, 2021, 08:18:42 pm by prof7bit »

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11382
  • FPC developer.
Re: Sharing same source file in two projects
« Reply #40 on: October 07, 2021, 12:02:51 pm »
The problem doesn't seem to me as complicated as it is presented.  O:-)

It depends. If you have complicated solutions (building a file list, reading PPU state, tracking everything), you want the result to be pretty airtight. If you want stuff built into the compiler, you want it to be really airtight.

I'm getting mixed signals about changing the compiler algorithm vs. the IDE's "decide compile or build" algorithm. The former is complicated, as the compiler doesn't cater as exclusively to the "compile the whole project in one go with one command line" model as the Lazarus IDE does. And even Lazarus doesn't, 100%.

So what are you talking about, extending codetools to make better decisions about compile or build, or changing the compiler?

Quote
After a build, for each unit source file there is a corresponding ppu file. In that ppu there are entries for all source files and units it depends on.

It lists only what it _directly_ depends on, and only if the last build was successful. If it was unsuccessful, a PPU might have been deleted, breaking the chain.

Quote
To decide whether to recompile or not, we should check whether any of the listed files/units it depends on has been changed.

There is no list. Maybe codetools have something but not sure if that is complete.

Quote
Usually we do that by comparing the current timestamps of the actual files against the timestamps saved in the ppu. If there is a change, then we must recompile the current unit.

That's what the compiler already does automatically. But that doesn't take defines into consideration.

Quote
Let's say we have an additional entry in the ppu, containing the command-line defines at the time of compilation.

Assuming it is built in one go. Lazarus can have projects depend on packages, and might auto-rebuild them too, creating a different command line for those units (from the .lpk of that package).

Quote
Now we can compare whether it matches the current set of defines. If it doesn't, we must recompile.

  • From a file included with the {$I file} directive: since the file will be listed in the ppu as one the current unit depends on, its timestamp will be newer

No, it might be older. The other project (b.lpr in my example, IIRC) has a different but older include file. The .ppu does not list paths, only names.

Quote
  • From the source file itself ({$DEFINE xxx}): it is also listed at the topmost position in the ppu, and the file timestamp will be newer

Local defines are AFAIK not in the ppu; only the ones in effect at source-file compilation entry are.

Quote
Thus, I can't see the need for multi-level research through the dep-tree and the corresponding .ppu's.

You don't have a complete list of files in the IDE. In the compiler you can do more, but that also poses problems, such as where to stop. The compiler doesn't know any package boundaries, including Lazarus packages and precompiled FPC units; it only knows ppus. The only time it recursively stops the ppu loading is when it encounters .ppu's compiled with -Ur (FPC release .ppu's; not sure if e.g. fpcupdeluxe enables this, though).

As already mentioned, to tackle this you could make -Ur multi-level (system, package, application) and add some parameter to relax the rebuild algorithm at a given package boundary.

But that is all quite massive, and I think there will be many practical problems with define checking. But the only way to find out is to try.

Quote
Climbing up the dependency tree (I'm assuming the leaves should be examined first), we can detect the change in the command-line defines for the shared unit and recompile it, even though its source has not changed.

What dependency tree?  As said, there is only executing the compiler on the first module, nothing more.

You can execute the compiler or resort to ppu/source scanning in the IDE, nothing else. There is no makefile with all items listed that can be transformed into a tree.

Quote
There will be different defines for a.lpr, thus unit3 will be recompiled when the change in defines is detected.

That this is not true was the point of the exercise. The defines are in an include file and thus not part of the command line (and so not in the .ppu), making them invisible to your proposed plan. So no, that doesn't work. You might be able to detect something fishy, though, because the include file's timestamp changed (either while compiling or when tracing .ppu's in the IDE).

Quote
The inline defines will be changed in some included source file; the list of those files is kept in the .ppu and their timestamps will be compared.

Inline defines are not listed in the ppu, since they are not global to the ppu. Moreover, even if they were, that would break your algorithm, since local unit defines would then be in the .ppu and always mismatch your command line.

Quote
Here is a somewhat more convoluted case (but not unthinkable; think of all the projects that use an include file for project settings).

E.g. take an a.lpr project that includes directory a/b/c/incdir1 (-Fi) for include files and a b.lpr project that includes directory a/b/c/incdir2.

That included file will be listed in unit3.ppu and its timestamp will be checked.

It will, but it won't necessarily be newer.

Quote
To put it in other words, can't the command-line definitions (which I presume are the same as the Laz project defines) be treated as saved in a temporary file, with each source having a dependency on it?

No. As said, there is no global list to hang such a dependency on.

Quote
Maybe a simple hash of the command line, stored in the .ppu, could do the job of just detecting the change.

The problem in leaving it to the compiler is when to stop. Even for the IDE the problem is what to do if it can't find a piece of source or ppu. Is that a missing file in the project, or does it belong to preinstalled packages? Quite involved to figure all that out.

Anyway, we are approaching the borders of my superficial knowledge of the compiler. The first thing to do is to decide on an approach: change the compiler or change the IDE?
« Last Edit: October 07, 2021, 12:26:30 pm by marcov »

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11382
  • FPC developer.
Re: Sharing same source file in two projects
« Reply #41 on: October 07, 2021, 12:09:50 pm »
The quickest way would be to just store a hash over the compiler command line in the ppu and recompile if it changes. This would still cause some unneeded recompilation if a unit does not actually use a certain define, but it would at least prevent units from NOT being recompiled when they really should be.

See the compiler halting problem described in the post to y.ivanov. The compiler only knows .ppus. It doesn't know which belong to the project, a Lazarus package, or an FPC package.

Sooner or later it will load an FPC or Lazarus unit that wasn't compiled with exactly those parameters.

Quote
The more complicated approach would be to store all compiler options individually in the ppu and mark those that did not affect the compilation of that particular unit as "DontCare". This would be a whole bunch of code, some of it in the depths of the compiler.

This precludes any source precompiled from outside the project. As said multiple times, many pieces of code (e.g. Lazarus packages and the FPC packages) need extra defines and options, so that "one global compile" model is not viable.

Quote
But it would also change the PPU format and probably also force other tools that parse it to make some changes or additions.

I think it would be possible, but you would have to flag all preexisting non-project .ppus in a special way, so the compiler knows the hash match doesn't need to be performed. IOW, define the project as a leaf, compile the rest as branch, and tell the compiler to apply the check only to units that are a leaf. Maybe even compile some as trunk (the current -Ur).

But as all of this is quite invasive and reaches across the whole codebase, the question is how practical it would all be.

Moreover, it relies on having a reliable leaf/branch/trunk classification, and it might only shift the problem.

alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #42 on: October 07, 2021, 01:18:46 pm »
@marcov,
What I can see now is that maybe I failed to explain my proposal clearly. It is basically what prof7bit wrote in his reply #39, i.e. to put into the ppu some footprint of the actual command line at the time of compilation, either as a hash or as a string, etc. (whatever is more convenient), and then compare against it.

Anyway, despite my bad explanation, now I can see the actual problems you're writing about:
The .ppu does not list paths, only names.
Is the first trouble. And then:
In the compiler you can do more, but that also poses problems, such as where to stop. The compiler doesn't know any package boundaries, including Lazarus packages and precompiled FPC units; it only knows ppus. The only time it recursively stops the ppu loading is when it encounters .ppu's compiled with -Ur (FPC release .ppu's; not sure if e.g. fpcupdeluxe enables this, though).
Is probably the actual stopper.

And just to clarify what I meant by dep-tree (it is not relevant anymore):
Quote
Climbing up the dependency tree (I'm assuming the leaves should be examined first), we can detect the change in the command-line defines for the shared unit and recompile it, even though its source has not changed.
What dependency tree?  As said, there is only executing the compiler on the first module, nothing more.
When the compiler is executed on the top module, it starts checking dependencies, walking through the used units' .ppu files. For each such .ppu file in turn, its own dependencies have to be checked (recursively) until a leaf .ppu is found. Thus, the compiler walks a dep-tree whose nodes are the visited .ppu files.
That is the dependency tree I'm talking about; it is not contained in a single file, it is a structure spanning multiple .ppu files. But it's there :)
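That implicit tree walk can be modelled like this (toy Python sketch; real .ppu loading is of course far more involved, and the unit names here are invented):

```python
def walk(unit, ppus, visited=None):
    """Visit units the way the compiler's recursive .ppu loading can be
    viewed: descend into each used unit until a leaf, post-order."""
    if visited is None:
        visited = []
    for dep in ppus[unit]["uses"]:
        if dep not in visited:
            walk(dep, ppus, visited)
    visited.append(unit)
    return visited

ppus = {
    "a":     {"uses": ["unit3", "unit4"]},  # the project's main module
    "unit3": {"uses": ["unit4"]},
    "unit4": {"uses": []},                  # a leaf
}
print(walk("a", ppus))  # ['unit4', 'unit3', 'a'] -- leaves examined first
```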

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11382
  • FPC developer.
Re: Sharing same source file in two projects
« Reply #43 on: October 07, 2021, 01:57:52 pm »
Is probably the actual stopper.

Not an absolute stopper, but things like this make it very complicated: not just the compiler algorithm, but all build systems must be made aware (and that assumes my hunch even works out in practice). And then the cost/benefit factor also comes into play.

Dep-tree: that's what I meant. The compiler traverses recursively, and that can be seen as a tree pattern, but that is not the same as first building a tree and then traversing it. Also, the compiler's unit building order is not entirely predictable, at least not intuitively.

Anyway, the discussion is still good, also for me as a core member but compiler outsider. Keeps me on my toes :-)


alpine

  • Hero Member
  • *****
  • Posts: 1038
Re: Sharing same source file in two projects
« Reply #44 on: October 07, 2021, 09:17:33 pm »
@marcov,
What if the suggested check is performed only on the .ppu's residing in the intermediate directory? The compiler should know that directory from its command-line parameters, I believe.

 
