Remember my post from back in April 2020? (Getting data into HANA – the easy way)
In it, I described how to use DBeaver to load CSV files into HANA quickly and hassle-free. In DBeaver, activities like importing or exporting data are so-called Database Tasks. For every execution of a task, DBeaver keeps a log file, and these log files can be easily accessed via the Database Tasks Eclipse view:

Cliffhanger
The idea is that one can double-click on the log file entries to see details. And you won’t believe what happened when I did…

The editor window that was supposed to show the log file only shows an out-of-memory error.
What might be the reason, you ask?
HUGE LOGFILES
No shit, Sherlock! This sort of error occurs when the log file to be loaded is too large for the current JVM heap limits. But how large is too large, and where are those log files saved anyhow?
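For context: that heap limit is the -Xmx value in DBeaver’s dbeaver.ini, the same mechanism as in any Eclipse-based application. The exact location and defaults vary by version and platform (on a Mac the file sits inside DBeaver.app/Contents/Eclipse/), but the relevant stanza looks roughly like this:

-vmargs
-Xms64m
-Xmx1024m

Whatever the default on your machine, it is a fixed ceiling, and any file anywhere near that size has no chance in the editor. As for where those log files actually live: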
An optimistic right-click on the log entry brings up a context menu that allows opening the log file folder. Folks, these are the little features that make DBeaver a proper developer tool and not just a fancy UI!

What’s the damage?

One look at the size of the log files and the “mystery” was solved immediately: 3.03 GB of log file data is a tad bit much for a standard Eclipse JVM. In fact, 3 GB of log file is a tad bit much anyhow.
Trusty tricks of the trade like less revealed that the files contain many, many entries like this:
2020-09-04 11:58:29.972 - Error parsing datetime string:
Text '1995-09-22 00:00' could not be parsed, unparsed text found at index 10
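The “index 10” part gives the game away: the first ten characters – 1995-09-22 – parse just fine as a plain date, and the trailing “ 00:00” is the unparsed leftover. So the import apparently expected a DATE in yyyy-MM-dd format while the CSV column delivered timestamps.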
How many is “many, many”? Glad you asked!
> grep 'Error parsing datetime string:' run_202009041158_1.log | wc -l
22489348
Round about 22.5 million failed date-parsing messages. Yes, I can see how that takes up more than 3 GB of space.
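Quick sanity check on that: each entry spans two lines, and the pair above weighs in at about 134 bytes including newlines (the exact width varies a little with the parsed values). Multiply that out:

> echo $((22489348 * 134))
3013572632

Roughly 3.01 GB – close enough to the 3.03 GB the file actually occupies on disk.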
And that was that.
What I cannot see, however, is a reason to still keep those log files on my SSD. Delete and swoosh… 6 GB of precious SSD space in my MacBook Pro is available for useful stuff again.
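For the record, the command-line version of “delete and swoosh” could look like this – the folder being whatever DBeaver’s “open log file folder” pointed you to (placeholder path below):

> find ~/path/to/dbeaver-task-logs -name 'run_*.log' -size +100M -delete

The -size +100M filter keeps the small, harmless run logs around and only sweeps out the monsters.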
There you go, now you know!