If it is done in the way that Delphi 12.2 does it, I do not have any objections.
It works with an AI provider mechanism: they have written providers for three commercial AI services at the moment, plus one local one, Ollama, which is on GitHub.
You can only use them if you have keys for those providers. That means Delphi only facilitates AI providers in its IDE but does not offer AI itself, and not by default. So it is an IDE plugin that abstracts away the different REST APIs.
The same approach could be used for Lazarus: just add a plugin that abstracts the different engines away behind providers/connectors so they integrate with the IDE on request.
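To make that idea a bit more concrete, here is a minimal sketch of what such a provider abstraction could look like in Free Pascal. All names (IAIProvider, RegisterAIProvider and so on) are made up for illustration; this is not Embarcadero's API nor any existing Lazarus package.

unit aiproviders;

{$mode objfpc}{$H+}

interface

uses
  Classes, SysUtils;

type
  { The only thing the IDE would talk to; every engine gets wrapped behind this. }
  IAIProvider = interface
    ['{B1F2D3C4-1111-4A2B-9C3D-0123456789AB}']
    function Name: string;
    function Complete(const APrompt: string): string;  // send prompt, return answer
  end;

{ Connectors register themselves so the IDE can offer them on request. }
procedure RegisterAIProvider(const AProvider: IAIProvider);
function FindAIProvider(const AName: string): IAIProvider;

implementation

var
  Providers: TInterfaceList;

procedure RegisterAIProvider(const AProvider: IAIProvider);
begin
  Providers.Add(AProvider);
end;

function FindAIProvider(const AName: string): IAIProvider;
var
  i: Integer;
begin
  Result := nil;
  for i := 0 to Providers.Count - 1 do
    if SameText((Providers[i] as IAIProvider).Name, AName) then
      Exit(Providers[i] as IAIProvider);
end;

initialization
  Providers := TInterfaceList.Create;

finalization
  Providers.Free;

end.

Each connector (for a commercial service or for Ollama) would then implement IAIProvider and translate Complete into the REST call of its particular backend; the IDE itself never needs to know which one it is talking to.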
AI integration in your own programs can already be done, e.g. with my example from Christmas 2022.
Not to forget Joao Schuler's work with models that can also run locally, similar to Ollama.
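As a rough illustration of the "in your own programs" case (this is not my Christmas 2022 example, just a sketch): a console program that sends one prompt to a locally running Ollama instance over its REST API. It assumes Ollama's default port 11434 and a pulled model named 'llama3', and uses only fphttpclient and fpjson that ship with FPC.

program ollamademo;

{$mode objfpc}{$H+}

{ Sketch only: posts one prompt to a local Ollama instance and prints the
  answer. Assumes Ollama listens on its default port 11434 and that a model
  called 'llama3' has been pulled. }

uses
  Classes, SysUtils, fphttpclient, fpjson, jsonparser;

var
  Client: TFPHTTPClient;
  Body: TStringStream;
  Resp: string;
  Json, Answer: TJSONData;
begin
  Client := TFPHTTPClient.Create(nil);
  Body := TStringStream.Create(
    '{"model":"llama3","prompt":"Write hello world in Pascal","stream":false}');
  try
    Client.AddHeader('Content-Type', 'application/json');
    Client.RequestBody := Body;
    Resp := Client.Post('http://localhost:11434/api/generate');
    Json := GetJSON(Resp);
    try
      Answer := Json.FindPath('response');
      if Assigned(Answer) then
        WriteLn(Answer.AsString)
      else
        WriteLn('Unexpected reply: ', Resp);
    finally
      Json.Free;
    end;
  finally
    Body.Free;
    Client.Free;
  end;
end.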
But usually this discussion is about the IDE integration, and if that is done via a plugin I have no objections.
Embarcadero presents it as a revolution in Delphi 12.2, but it really is not, and that made me laugh a bit.
What I suspect Embarcadero did do, but that is a commercial decision, is encourage the commercial providers to explicitly scrape its public resources, i.e. its manuals and blogs, but also its forums and some other semi-public material that sits behind passwords, so that the providers can give better answers. At least that is what I would have done.
This will also benefit Free Pascal and Lazarus users who use AI, because the models will be better trained on our common language.