Are you trying to build a cross-compiler plugin system? E.g. the caller built with fpc 3, the plugin with fpc 2.2.0 or gcc, and have them work together?
I'd like to know if I can combine two experiments. One of them is a 3D virtual environment. Add-ons would have to be shown inside this 3D virtual environment. The add-ons are stored in library files and should be buildable with any compiler. I think the difficult part lies at the widgetset level. The first implemented add-on of this kind would be the maintainer part of Flavouring Control System, which is another experiment. All of them are closed source.
@GetMem
I've looked at the attachment; I need the opposite of that.

The TImage will be the plugin container. The only way the plugin container shows the plugins is by loading their rendered bitmaps from pointers; the pointer is returned by the plugin at the container's request.
If the container passes a handle to the plugin, the plugin must not use that handle to create visible components.
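To stay buildable with any compiler, the library could export a flat, procedural API instead of classes. A minimal sketch of the plugin side, assuming cdecl so a gcc-built plugin can match the calling convention (all names and the frame layout are hypothetical, just to illustrate the contract):

```pascal
library MyPlugin;

{$mode objfpc}

type
  { Raw pixel buffer describing the rendered frame; the plugin owns the memory }
  TPluginFrame = record
    Width, Height: LongInt;
    Pixels: Pointer;  // Width * Height * 4 bytes (32-bit pixels)
  end;
  PPluginFrame = ^TPluginFrame;

var
  Frame: TPluginFrame;

{ Receive a mouse event forwarded by the container.
  Shift flags are packed into a plain integer to keep the ABI language-neutral. }
procedure PluginMouseMove(Shift, X, Y: LongInt); cdecl;
begin
  // React to the event, mark internal state dirty, etc.
end;

{ Redraw the off-screen bitmap; no visible window is created here }
procedure PluginRender; cdecl;
begin
  // Paint the plugin's widgets into Frame.Pixels
end;

{ Hand the container a pointer to the rendered frame }
function PluginOutputImage: PPluginFrame; cdecl;
begin
  Result := @Frame;
end;

exports
  PluginMouseMove, PluginRender, PluginOutputImage;

begin
end.
```

The point of the flat API is that nothing compiler-specific (classes, exceptions, managed strings) crosses the library boundary; only pointers, integers and a plain record do.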
How do you interact with the plugin? You forward the "On..." events (or the messages) of the TImage (the plugin container) to the plugin (the TForm window) using a dedicated procedure. For example, when the mouse moves over the TImage (the plugin container) you would use:
procedure TfPluginForm.Image1MouseMove(Sender: TObject; Shift: TShiftState;
  X, Y: Integer);
// Just a sketch for presentation purposes; I would avoid the "On..." events.
begin
  // Forward the mouse-move message to the plugin
  plugin.sendmessage('mousemove', Shift, X, Y);
  { Ask the plugin to update the bitmap stream.
    The way the plugin fills the bitmap stream should be of no concern to the
    plugin container. No visible components should appear during this process. }
  plugin.render;
  { Show the plugin's rendered image.
    The bitmap might contain buttons, comboboxes, new windows, whatever the
    plugin drew and saved into that bitmap buffer. }
  Image1.loadfrombuffer(plugin.outputimage);
end;
This is an analogy.
The plugin container is like a 3d-shooter genre game, something like Doom or Heretic. The plugin makes some walls interactive.
On various walls, instead of ordinary brick textures, a user might also see the screens of running applications. These running applications are the plugins of the game. The applications are real and run on that computer, but their visual output lands on the walls as textures, not directly on the user's screen. Not only is the image the user sees in the 3D environment consistent with what they would normally see if the application were running in a 2D desktop environment, but moving around and firing the "gun" at the wall is the equivalent of mouse clicks for those applications. Pressing keyboard buttons while the "player" stands close to a wall that hosts such a plugin makes the container send those keyboard events to the plugin.
As you can see, the container must use textures. These textures must be updated by reading a bitmap image from the library (the plugin). Mouse and keyboard events must be sent to the plugin through procedures stored inside the library. The visual output of the plugin must land in a bitmap buffer that is read by the plugin container, not drawn directly on the screen by the plugin.
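On the container side, those procedures can be fetched from the library at run time, so the compiler used to build the plugin does not matter as long as the calling convention and data layout match. A sketch using FPC's dynlibs unit (the exported names are hypothetical placeholders for whatever the plugin actually exports):

```pascal
uses
  SysUtils, dynlibs;

type
  TPluginMouseMove   = procedure(Shift, X, Y: LongInt); cdecl;
  TPluginRender      = procedure; cdecl;
  TPluginOutputImage = function: Pointer; cdecl;

var
  Lib: TLibHandle;
  PluginMouseMove: TPluginMouseMove;
  PluginRender: TPluginRender;
  PluginOutputImage: TPluginOutputImage;

procedure LoadPlugin(const FileName: string);
begin
  Lib := LoadLibrary(FileName);
  if Lib = NilHandle then
    raise Exception.Create('Cannot load plugin: ' + FileName);
  // Resolve the exported entry points by name
  Pointer(PluginMouseMove)   := GetProcAddress(Lib, 'PluginMouseMove');
  Pointer(PluginRender)      := GetProcAddress(Lib, 'PluginRender');
  Pointer(PluginOutputImage) := GetProcAddress(Lib, 'PluginOutputImage');
  if not Assigned(PluginMouseMove) or not Assigned(PluginRender) or
     not Assigned(PluginOutputImage) then
    raise Exception.Create('Plugin is missing a required export');
end;
```

After LoadPlugin succeeds, the container's event handler simply calls PluginMouseMove(...), then PluginRender, then reads the buffer returned by PluginOutputImage and uploads it as the wall texture.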
Did the above analogy make things clearer, or have I just managed to add more noise to the subject?
