Author Topic: Considerations in the decision of adding a feature to the FPC compiler?  (Read 972 times)

marcov

  • Global Moderator
  • Hero Member
  • *****
  • Posts: 7071
What Leledumbo says; and that also illustrates that one should be very careful with borrowing features. What a feature ends up being might be quite different from what it is in the other language.

The classic example is GC. People compare languages with a movable, generational GC designed into the core, and then say "Yes, but you can also whack a GC under C, so it should also be possible for Pascal". NO, not the SAME GC :)

In any case, radical changes are not considered AT ALL.

And personally, I think the undirected drive to modernization is just a pretext for forcing random language experimentation into a project where it doesn't belong.
« Last Edit: June 12, 2019, 08:43:11 am by marcov »

PascalDragon

  • Sr. Member
  • ****
  • Posts: 352
  • Compiler Developer
Quote
Is there some somewhat "formal"/"structured" process that leads to the decision of including/excluding such features ?

No. If a committer wants to commit them, there is sometimes discussion on the mailing lists. Unfortunately, IMHO, features get accepted way too often with the argument "you don't have to use it if you don't like it".
I think the recent discussions on fpc-devel showed that we're still not accepting everything and the kitchen sink. ;)

Quote
In my opinion the feature list is already too long. It's nearly impossible to know all the possibilities, the syntax options and all the side effects of FPC's various features.
So instead of introducing new features, I would suggest doing some tidying up.
E.g.
- The writable typed constant concept as a local static variable is counterintuitive; perhaps constants should never be variables.
- Classes, objects, advanced records: that's one too many.
- Modula-2 syntax for blocks: if ... then ... end; for ... do ... end;
- Passing constant values to var parameters should be possible. (Well, this would be backward compatible.)
- Parameters as var, out, const, constref... perhaps this could be done better.
- Volatility (I know it was introduced, but as I understand it, if the compiler really built on it, this would break existing code)
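For anyone who hasn't run into the first point: here is a minimal sketch (my own, just to illustrate the complaint) of how a "typed constant" is really a static local variable when writable typed constants are enabled:

```pascal
{$mode objfpc}
{$J+}  // writable typed constants on (the default in most FPC modes)
program TypedConstDemo;

procedure CountCalls;
const
  calls: Integer = 0;  // declared as a "constant", but it keeps state
begin
  Inc(calls);          // perfectly legal under {$J+}: it is really a
                       // C-style static local variable in disguise
  WriteLn('call #', calls);
end;

begin
  CountCalls;  // prints: call #1
  CountCalls;  // prints: call #2
end.
```

Whether that behavior is a handy feature or a misnamed one is exactly what the list above is arguing about.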

These are just examples; there's no need to discuss them here. I also know that there are very good reasons why things are done the way they currently are. Furthermore, breaking backward compatibility is a huge step.
No. Just no. Both compatibility with TP/Delphi and backwards compatibility are important core concepts of FPC (though Delphi compatibility took quite a chink with Delphi introducing inline variables %) ) and thus we won't restructure core concepts only because some think there is too much. Those people can always start a new compiler, because that's what it would essentially be.

SymbolicFrank

  • Sr. Member
  • ****
  • Posts: 439
Half my projects end up written in a different language, simply because of the lack of support for a library or target. Like bigint, iOS or embedded. Mac support is still not complete. And the installation still fails half the time (especially on Linux).

Ok, it's not as sexy as adding stuff to the environment you actively code in, but if you want to grow support for the language, the most important feature is that you can start coding your new project right away. If it takes days or weeks of effort before you can start, most people will pick an alternative development platform. And the few who do make the effort end up as part of the dev team.

Leledumbo

  • Hero Member
  • *****
  • Posts: 8074
  • Programming + Glam Metal + Tae Kwon Do = Me
Quote
but, I think of Nim as a macro language itself since it doesn't compile to native code, it produces C.
If you take a look at the generated C code, or how it's generated, it's far from other compilers that transpile to C, where the C code is usually still human readable. The author treats C like an abstract architecture, hence the generated code looks more like assembly than converted C.

440bx

  • Hero Member
  • *****
  • Posts: 826
Quote
but, I think of Nim as a macro language itself since it doesn't compile to native code, it produces C.
If you take a look at the generated C code, or how it's generated, it's far from other compilers that transpile to C, where the C code is usually still human readable. The author treats C like an abstract architecture, hence the generated code looks more like assembly than converted C.
I'm not familiar enough with Nim to evaluate it properly, but the fact that a separate compiler is involved in the process of getting executable code makes me uncomfortable. That additional compiler is, at least potentially, another source of problems.
using FPC v3.0.4 and Lazarus 1.8.2 on Windows 7 64bit.

Leledumbo

  • Hero Member
  • *****
  • Posts: 8074
  • Programming + Glam Metal + Tae Kwon Do = Me
Quote
I'm not familiar enough with Nim to evaluate it properly, but the fact that a separate compiler is involved in the process of getting executable code makes me uncomfortable. That additional compiler is, at least potentially, another source of problems.
Indeed, though supporting multiple C compilers was also a source of attraction. It actually has more than one backend: the compiler can generate C, C++, Objective-C and JavaScript. Much like FPC restricted to its high-level codegen backends, plus Pas2JS.