I think it would be much better to invest in a UPS.
I think the same, since I don't know whether a solid-state drive would be of any help.
The characters that fill the file are gathered one at a time, with time intervals between reads that are neither known nor predictable. The fear is that during those intervals anything can happen; the worst case is a power failure, since the software may run for weeks and the machine is not connected to an uninterruptible power supply. That combination led to the decision to write the file character by character, to be on the safe side, because every gathered character is important.
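If the goal is that each character survives a power cut, writing it is not enough: it also has to be flushed out of the application and OS buffers. A minimal sketch in Python (the file name `capture.txt` is just an example) of a per-character durable append:

```python
import os

def append_char_durably(path: str, ch: str) -> None:
    """Append one character and force it to disk before returning."""
    with open(path, "a", encoding="utf-8") as f:
        f.write(ch)
        f.flush()              # push from Python's buffer to the OS
        os.fsync(f.fileno())   # ask the OS to write it to the physical disk

# example usage: each character is on disk before the next one is gathered
for ch in "abc":
    append_char_durably("capture.txt", ch)
```

The `fsync` per character is exactly the penalty described below for SQLite: every write pays a full trip to the disk, which is slow but limits the loss to at most the character in flight.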
I have had to deal with data loss due to power failure. You fear losing some characters, but you can actually lose the whole file if it becomes corrupt!
What we did to keep as much data as possible was switch to SQLite, and it slowed down execution during saves. SQLite is very robust against power failures because every transaction goes directly to the hard drive, with the penalty that implies (spin up the disks, move the heads to the location of the file/record, open the file for editing, write the new data, close the file). Even so, you can still lose some data, but you will not end up with a corrupt file.
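The "one transaction per character" idea can be sketched with Python's built-in `sqlite3` module (the database name `capture.db` and table name `chars` are just illustrative; `PRAGMA synchronous = FULL` is SQLite's most durable setting):

```python
import sqlite3

conn = sqlite3.connect("capture.db")
conn.execute("PRAGMA synchronous = FULL")  # fsync on every commit
conn.execute(
    "CREATE TABLE IF NOT EXISTS chars (seq INTEGER PRIMARY KEY, ch TEXT)"
)

def save_char(ch: str) -> None:
    # "with conn" makes each insert its own transaction:
    # it is committed (and synced to disk) before this returns
    with conn:
        conn.execute("INSERT INTO chars (ch) VALUES (?)", (ch,))

# example usage
for ch in "abc":
    save_char(ch)
```

After a crash, the committed rows can be read back in order with `SELECT ch FROM chars ORDER BY seq`; at worst the single uncommitted character is lost, never the database.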
I know you are not trying to use a DB, but look at these links:
https://www.sqlite.org/about.html and
https://www.sqlite.org/transactional.html. Reading those pages might show you what problems you are facing on the "anything can happen" part of your problem.
It looks like you are getting data from a sensor in a remote place (where no one is around). If not, the problem is basically the same. If the data is generated inside the PC (e.g. random numbers, a simulation, or whatever you can imagine) then you will lose something; if it comes from external hardware that has a persistent buffer, then maybe you can recover something.
I would go for a partitioned file model like the one described before.
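One way to read "partitioned file model": rotate to a fresh file every N characters, so that if a power failure corrupts the file being written, you lose at most one partition instead of everything. A minimal sketch, with hypothetical names (`capture_0000.txt`, `capture_0001.txt`, ...) and a made-up chunk size:

```python
import os

class PartitionedWriter:
    """Append chars to capture_0000.txt, capture_0001.txt, ...,
    starting a new file every `chunk` characters so a corrupt
    file costs at most one partition's worth of data."""

    def __init__(self, prefix: str = "capture", chunk: int = 1024):
        self.prefix = prefix
        self.chunk = chunk
        self.count = 0   # chars written to the current partition
        self.part = 0    # current partition number

    def write(self, ch: str) -> None:
        path = f"{self.prefix}_{self.part:04d}.txt"
        with open(path, "a", encoding="utf-8") as f:
            f.write(ch)
            f.flush()
            os.fsync(f.fileno())  # durable before we count it
        self.count += 1
        if self.count >= self.chunk:
            self.count = 0
            self.part += 1

# example usage: with chunk=2, "abcde" lands in three files
w = PartitionedWriter(prefix="cap", chunk=2)
for ch in "abcde":
    w.write(ch)
```

The chunk size is the knob: smaller partitions mean less data at risk per file but more files to stitch back together afterwards.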
But before taking a decision, I think it is important to know when, how, and at what rate the data is acquired:
. How many chars per second? (average bytes per second, minute, hour)
. Is it sort of "regular"? e.g. more or less one every 50 milliseconds
. Is it VERY irregular? e.g. hours of low or no activity, then a burst of 50 million chars per second
Those are just the first questions that come to mind. The rate at which the data is acquired might help you choose the best approach.