List:       koffice
Subject:    Mega documents - what to do?
From:       Werner Trobin <wtrobin () mandrakesoft ! com>
Date:       2000-05-22 5:41:23

Hi!

A few minutes ago I decided to *test* the CSV import filter
of KSpread >:->

I fetched an old CSV file from the lab and tried to import
it (5 columns, 30,000 rows, ';'-separated :)

KSpread (in fact, the CSV filter) started to consume enormous
amounts of memory until all of it was used up (about 300MB for
the KSpread process and a "little bit" (100MB) for gdb :)

Is it possible to handle such huge documents in KSpread at all,
or would this crash even if the CSV filter didn't? For example,
the filter could import the values directly into the KoDocument
and skip the creation of a biiiig intermediate QDomDocument.

I'm asking because if KSpread is able to handle this, I'd
change the CSV filter to import directly (into the KoDocument)...
I know that this is not the preferred solution, but I don't think
I'm the only one working with huge tables (generated by ADC cards
and some lab software).

-- 
Werner Trobin - wtrobin@mandrakesoft.com
