Author Topic: Why does Windows slow down when I start multiple executables simultaneously?  (Read 10404 times)

karlsplatzo

  • New Member
  • *
  • Posts: 20
I write my own programs for academic projects, mostly involving numerical analysis.

I have one that I'd like to run over several CPUs simultaneously to get through a few runs.

I've set up a script that runs the program (say 30 times). I duplicate the script (say 8 times, one per CPU) and then run them all. Each script runs in a different directory.
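For concreteness, each script is essentially just the program called over and over (the executable name here is only a placeholder):

mysolver.exe
mysolver.exe
.
.
.
[repeated 30 times]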

If I set off the scripts with a little delay between each one, then as long as the program is still running by the time I set off the last one, all works well.

The problem I have is that when multiple instances of the program start simultaneously, Windows seems to slow to a crawl at starting programs. Once a program gets going, though, it runs at normal speed. The computer remains slow to start other things, to the point of requiring a reboot. The shorter my code's run time, the worse this is, because more instances of the code start at the same time.

To rule out my code causing a memory leak, I reduced it to a "Hello World"-type program, and the slowdown still happens.

I ruled out the compiler causing the problem by building with both gfortran (64-bit) and Lazarus (64-bit, console application); the effect is the same.

I wondered if using the same name for each running instance was the problem, but running a differently named copy of the executable in each script doesn't help.

The problem arises on two separate computers, one running Windows 7 Home, the other Windows 10 Education.

Task Manager shows that CPU and memory usage are low while the computer is slow.

Can anyone help? Has anyone experienced this?

I've asked this on the Microsoft Community and was pointed to the Lazarus or gfortran forums.

I've also asked on Tom's Hardware. The suggestion there was that the code might be overusing the disk, but there's no evidence of that in Task Manager. It was also suggested that loading libraries into the cache could cause a problem, but I don't see any memory issues. Besides, the gfortran code was compiled with "-static", so everything is in the executable; I presume my Lazarus code works the same way.
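For reference, the gfortran build is just something along the lines of the following (the file names are placeholders), so the runtime libraries should already be linked into the .exe:

gfortran -static main.f90 -o mysolver.exe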


tudi_x

  • Hero Member
  • *****
  • Posts: 532
Did you consider running the software on Linux/Unix/Proxmox?
Especially if you do not have a GUI, why would you need Windows?
There are many advantages to using a Linux installation, even without X Windows.
« Last Edit: December 01, 2017, 01:53:24 pm by tudi_x »
Lazarus 2.0.2 64b on Debian LXDE 10

marcov

  • Administrator
  • Hero Member
  • *
  • Posts: 11383
  • FPC developer.
Windows is slow at executing files, but that mostly shows up when you measure it overall.

(e.g. building FPC on Windows is slower than on Linux, almost a factor of two, because of this and because searching for files in directories is slower).

However, that is not really on human-noticeable timescales. What IS on a human-noticeable timescale, however, is the antivirus. You could try putting the generated files in a directory that you exclude from scanning and test whether that is faster.
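For example, on Windows 10 with Windows Defender as the active scanner, something along these lines from an elevated prompt should add a folder exclusion (the path is only a placeholder, and other antivirus products have their own exclusion settings):

powershell -Command "Add-MpPreference -ExclusionPath 'C:\runs'"

That is an untested sketch, but it shows the idea.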

karlsplatzo

  • New Member
  • *
  • Posts: 20
Did you consider running the software on Linux/Unix/Proxmox?
Especially if you do not have a GUI, why would you need Windows?
There are many advantages to using a Linux installation, even without X Windows.

Thanks for your reply.

Yes, for serious research I use a Linux cluster. However, for educational purposes I am restricted to our PC lab, which runs Windows. It's also useful to be able to do things on a laptop or home PC too.

(I actually have an MPI code that automates running the executables on Linux, and I use a script to detect multiple processes on Windows, but I didn't describe that code because the slow-startup problem is independent of it.)

karlsplatzo

  • New Member
  • *
  • Posts: 20
Windows is slow at executing files, but that mostly shows up when you measure it overall.

(e.g. building FPC on Windows is slower than on Linux, almost a factor of two, because of this and because searching for files in directories is slower).

However, that is not really on human-noticeable timescales. What IS on a human-noticeable timescale, however, is the antivirus. You could try putting the generated files in a directory that you exclude from scanning and test whether that is faster.

Thanks for your reply.

Again, I appreciate the advantages of Linux, but the problem here is that the multiple processes seem to interact strongly with each other as they start up, and only the startup is affected. After that, every program that starts is affected too, and I have to reboot to make the PC useful again.

Once the programs get into their number-crunching phase, they work fine. Ironically, the less the code has to do, the worse the problem.

I'll have a look at the anti-virus software though.

KemBill

  • Jr. Member
  • **
  • Posts: 74
Another interesting thing: what interpreter do you use for your scripts?

If it's cmd.exe, be careful with screen output and buffering: in a batch file (*.bat, *.cmd), displaying text on the screen can raise CPU I/O wait.

Try using a '@' before your command, i.e. '@somecommand.exe'.
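A tiny example of the difference (somecommand.exe is only a placeholder):

rem without '@', cmd.exe echoes the command line itself before running it
somecommand.exe
rem with '@' (or 'echo off' at the top of the script), only the program's own output appears
@somecommand.exe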
« Last Edit: December 01, 2017, 07:20:30 pm by KemBill »

ASerge

  • Hero Member
  • *****
  • Posts: 2223
For verification, please attach a script with a detailed sequence of actions for using it, replacing the call to your program with calc.exe.

karlsplatzo

  • New Member
  • *
  • Posts: 20
Another interesting thing: what interpreter do you use for your scripts?

If it's cmd.exe, be careful with screen output and buffering: in a batch file (*.bat, *.cmd), displaying text on the screen can raise CPU I/O wait.

Try using a '@' before your command, i.e. '@somecommand.exe'.

Sorry for the delay in replying.

I am using cmd.exe. I tried the "@" and it does remove the echo of the executable name when running. Unfortunately it didn't help overall.

I appreciate that there is a noticeable delay writing to the screen if that's all you do, compared to doing nothing at all. Running a "Hello World" 30 times in succession in a script takes a couple of seconds. However, when I try to run 3 or 4 scripts (with 4 or more CPUs), the PC/laptop grinds to a halt. "Hello World" takes minutes to appear in one instance of the code (though it isn't the output that's slow: if the code does some other work, it runs normally; it's just the startup that is the problem).

I thought about using PowerShell, but that's a new ball game to get into right now.
« Last Edit: December 06, 2017, 01:05:59 pm by karlsplatzo »

karlsplatzo

  • New Member
  • *
  • Posts: 20
For verification, please attach a script with a detailed sequence of actions for using it, replacing the call to your program with calc.exe.

It's very simple. It would be:
calc.exe
calc.exe
calc.exe
calc.exe
.
.
.
[repeated 30 times]

I haven't actually used calc.exe as that would be doing something different. I have made duplicates of the executable and renamed them so that each script is running a unique executable.

KemBill

  • Jr. Member
  • **
  • Posts: 74
try

Start yourprogram.exe
[30 times]

But are you forking?
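A compact way to write that in a batch file, with yourprogram.exe as a placeholder:

rem launches 30 instances without waiting for each one to exit
rem (use %i instead of %%i if typing this at an interactive prompt)
for /L %%i in (1,1,30) do start "" yourprogram.exe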

ASerge

  • Hero Member
  • *****
  • Posts: 2223
It's very simple. It would be:
calc.exe
calc.exe
calc.exe
calc.exe
.
.
.
[repeated 30 times]
I used test.cmd:
@echo off
start "" calc.exe
start "" calc.exe
...
[repeated 30 times]
When I run this file, the taskbar immediately fills up with buttons and many calculators appear on the screen, overlapping each other.
There is no delay.

KemBill

  • Jr. Member
  • **
  • Posts: 74
In batch files, commands are queued (they are interpreted line by line), so you are waiting for the current command to end before starting another. By using start on each line you make a sort of "fork", so you don't have to wait for the end of the previous command. Does that help?
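To illustrate the difference (myprogram.exe is only a placeholder name):

rem blocking: cmd.exe waits for this instance to exit before reading the next line
myprogram.exe
rem non-blocking: cmd.exe launches the instance and immediately moves on
start "" myprogram.exe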

ASerge

  • Hero Member
  • *****
  • Posts: 2223
In batch files, commands are queued (they are interpreted line by line), so you are waiting for the current command to end before starting another. By using start on each line you make a sort of "fork", so you don't have to wait for the end of the previous command. Does that help?
No. The start command doesn't wait; that's why I gave this example.
Try it on your computer.

KemBill

  • Jr. Member
  • **
  • Posts: 74
 :o
In batch files, commands are queued (they are interpreted line by line), so you are waiting for the current command to end before starting another. By using start on each line you make a sort of "fork", so you don't have to wait for the end of the previous command. Does that help?
No. The start command doesn't wait; that's why I gave this example.
Try it on your computer.

:o Are you serious? Skim reader :D
« Last Edit: December 06, 2017, 11:03:38 pm by KemBill »

ASerge

  • Hero Member
  • *****
  • Posts: 2223
:o Are you serious? Skim reader :D
I apologize, I read it inattentively.

 
