But what I have in mind is not a real scope but an application that shows some (rather complicated) calculated functions in "real time".
First I tried doing the calculations and drawing into an Image on the fly, but that was far too slow (on a 3.5 GHz Pentium with 8MB RAM).
So I opted for calculating and drawing onto a Bitmap ahead of time and then showing the images at a fast rate ... and that is where my problems began ...
I think I will have a look at how it is done in game programming, as suggested by several members.
Thanks to all for your help, it was a pleasant surprise to find such a helpful community.
Rik
Techniques used in game development may very well solve the problem. From what you describe, it seems you had the right idea but ended up going too far with it.
You may not need 6000 bitmaps, just one that you draw on and then BitBlt onto the screen. That method is usually fast enough for most purposes. Of course, I have no idea how long your calculations take, but if you can pre-calculate the final values, chances are a single bitmap in a memory device context will be fast enough. Even on a Pentium at 3.5 GHz, you should be able to get around 60 fps using a screen-compatible bitmap in a memory device context.
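To make that concrete, here is a minimal Win32/GDI sketch of the idea, assuming a plain C++ window; the names (CreateBackBuffer, RenderFrame) and the sine-wave drawing are only illustrative stand-ins for your own calculation, not anything from your actual code:

// Minimal sketch: draw each frame into one screen-compatible bitmap held in a
// memory device context, then BitBlt it onto the window in WM_PAINT.
// Error handling and window resizing are omitted to keep the sketch short.
#include <windows.h>
#include <cmath>

static HDC     g_memDC     = nullptr;   // memory device context
static HBITMAP g_memBitmap = nullptr;   // screen-compatible back buffer
static HBITMAP g_oldBitmap = nullptr;   // bitmap originally selected in the DC
static int     g_width = 0, g_height = 0;
static double  g_time  = 0.0;

// Create the off-screen bitmap once, sized to the client area.
static void CreateBackBuffer(HWND hwnd)
{
    RECT rc;
    GetClientRect(hwnd, &rc);
    g_width  = rc.right;
    g_height = rc.bottom;

    HDC screenDC = GetDC(hwnd);
    g_memDC     = CreateCompatibleDC(screenDC);
    g_memBitmap = CreateCompatibleBitmap(screenDC, g_width, g_height);
    g_oldBitmap = (HBITMAP)SelectObject(g_memDC, g_memBitmap);
    ReleaseDC(hwnd, screenDC);
}

// Draw one frame into the memory DC (a sine wave stands in for the real maths).
static void RenderFrame()
{
    RECT rc = { 0, 0, g_width, g_height };
    FillRect(g_memDC, &rc, (HBRUSH)GetStockObject(WHITE_BRUSH));

    MoveToEx(g_memDC, 0, g_height / 2, nullptr);
    for (int x = 0; x < g_width; ++x)
    {
        int y = g_height / 2 - (int)(std::sin(x * 0.05 + g_time) * g_height / 3);
        LineTo(g_memDC, x, y);
    }
}

static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    switch (msg)
    {
    case WM_CREATE:
        CreateBackBuffer(hwnd);
        RenderFrame();
        SetTimer(hwnd, 1, 16, nullptr);        // roughly 60 updates per second
        return 0;
    case WM_TIMER:
        g_time += 0.1;
        RenderFrame();                         // draw the next frame off-screen
        InvalidateRect(hwnd, nullptr, FALSE);  // request a WM_PAINT, no erase
        return 0;
    case WM_PAINT:
    {
        PAINTSTRUCT ps;
        HDC hdc = BeginPaint(hwnd, &ps);
        BitBlt(hdc, 0, 0, g_width, g_height, g_memDC, 0, 0, SRCCOPY);
        EndPaint(hwnd, &ps);
        return 0;
    }
    case WM_DESTROY:
        KillTimer(hwnd, 1);
        SelectObject(g_memDC, g_oldBitmap);
        DeleteObject(g_memBitmap);
        DeleteDC(g_memDC);
        PostQuitMessage(0);
        return 0;
    }
    return DefWindowProc(hwnd, msg, wp, lp);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE, LPSTR, int nCmdShow)
{
    WNDCLASS wc = {};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.hCursor       = LoadCursor(nullptr, IDC_ARROW);
    wc.lpszClassName = TEXT("MemDcDemo");
    RegisterClass(&wc);

    HWND hwnd = CreateWindow(TEXT("MemDcDemo"), TEXT("Memory DC / BitBlt demo"),
                             WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                             800, 600, nullptr, nullptr, hInst, nullptr);
    ShowWindow(hwnd, nCmdShow);

    MSG m;
    while (GetMessage(&m, nullptr, 0, 0)) { TranslateMessage(&m); DispatchMessage(&m); }
    return 0;
}

The important part is that all the drawing happens into g_memDC off-screen; the only work done in WM_PAINT is a single BitBlt, which is what keeps the screen update cheap.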
If the calculations are what is slowing the screen updates down, then what you really need is to take a page out of the crypto-mining book and use the GPU to do the calculations. That's no simple task, but it will be _fast_ if done right; of course, the speed is directly related to the GPU's capabilities.
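If you do go down the GPU road, something like OpenCL (or CUDA, or a compute shader) lets you evaluate all the sample points of one frame in parallel. The following is only a rough sketch of the idea, with error checking stripped out; the kernel name eval_wave and the sine formula are made-up placeholders for your real calculation, and it assumes an OpenCL runtime and GPU driver are installed:

// Rough OpenCL sketch: compute N samples of a function on the GPU in one call.
// Error handling omitted; "eval_wave" and the formula are placeholders only.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Kernel source: each work-item computes one sample.
static const char* kKernelSrc = R"CLC(
__kernel void eval_wave(__global float* out, const float t)
{
    int i = get_global_id(0);
    out[i] = sin(i * 0.05f + t);   /* stand-in for the real calculation */
}
)CLC";

int main()
{
    const size_t n = 1 << 20;               // number of samples per frame
    std::vector<float> result(n);
    cl_int err = 0;

    // Pick the first GPU of the first platform (no error checking here).
    cl_platform_id platform;
    cl_device_id   device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);

    cl_context       ctx   = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, &err);

    // Build the kernel and allocate a buffer for the results.
    cl_program prog = clCreateProgramWithSource(ctx, 1, &kKernelSrc, nullptr, &err);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel  kern = clCreateKernel(prog, "eval_wave", &err);
    cl_mem     buf  = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, n * sizeof(float), nullptr, &err);

    // One "frame": set the time parameter, run the kernel, read the samples back.
    float t = 0.0f;
    clSetKernelArg(kern, 0, sizeof(cl_mem), &buf);
    clSetKernelArg(kern, 1, sizeof(float), &t);
    clEnqueueNDRangeKernel(queue, kern, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(queue, buf, CL_TRUE, 0, n * sizeof(float), result.data(), 0, nullptr, nullptr);

    std::printf("first sample: %f\n", result[0]);

    clReleaseMemObject(buf);
    clReleaseKernel(kern);
    clReleaseProgram(prog);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}

The values read back would then be drawn into the memory-DC bitmap as before; whether the round trip to the GPU is worth it depends entirely on how heavy your calculations actually are.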
Try the simplest approach first: one screen-compatible bitmap rendered in a memory device context and BitBlt-ed onto the screen as necessary. Odds are reasonably good that you'll get enough speed out of that (unless you're running the code on a single-core, 400 MHz dinosaur, in which case going to church to pray for speed might be the only solution... and as you know, churches are well known for not offering any guarantees).