From sqlite-users Fri Feb 27 21:31:37 2009
From: python () bdurham ! com
Date: Fri, 27 Feb 2009 21:31:37 +0000
To: sqlite-users
Subject: Re: [sqlite] SQLite vs. Oracle (parallelized)
Message-Id: <1235770297.12817.1302798539 () webmail ! messagingengine ! com>
X-MARC-Message: https://marc.info/?l=sqlite-users&m=123577030307622

Alexey,

Thank you for your reply and for sharing your success with SQLite. I'm excited by your results (60x faster).

On an informal basis, we've been going back and re-benchmarking some of our old, 'traditional' (Oracle/Informatica) ETL/DW projects, and we now believe the majority of these systems could be simplified and made faster by using alternative techniques based on in-memory data processing (definitely) and/or SQLite (we still need to test).

Your approach of splitting large data sets sounds similar to what other SQLite users with large data sets seem to be doing. At a high level, this sounds like how one would partition data using Oracle? I'm going to start a new thread on this topic. A rough sketch of the kind of splitting I have in mind follows at the end of this message.

> With your hardware I think 100Gb dataset is not limit.

Good news. I'm looking forward to verifying this over the next month or so.

Best regards,
Malcolm

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users
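
Sketch referenced above: a minimal illustration of splitting a large table across several SQLite database files and then querying them together via ATTACH, which is one way to approximate Oracle-style partitioning with SQLite. This is not Alexey's actual approach; the file names, table schema, and hash-based routing are assumptions made up for the example.

    import sqlite3

    # Hypothetical partition files; in practice one file per key range or hash bucket.
    PARTITIONS = ["part_0.db", "part_1.db", "part_2.db"]

    def load_partition(path, rows):
        """Create one partition file and bulk-load its share of the rows."""
        con = sqlite3.connect(path)
        con.execute(
            "CREATE TABLE IF NOT EXISTS sales "
            "(id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
        )
        con.executemany(
            "INSERT INTO sales (id, region, amount) VALUES (?, ?, ?)", rows
        )
        con.commit()
        con.close()

    # Route each row to a partition by hashing its id (a simple stand-in
    # for the range or hash partitioning one would define in Oracle).
    sample = [(i, "EU" if i % 2 else "US", float(i)) for i in range(30)]
    buckets = {path: [] for path in PARTITIONS}
    for row in sample:
        buckets[PARTITIONS[row[0] % len(PARTITIONS)]].append(row)
    for path, rows in buckets.items():
        load_partition(path, rows)

    # Query across all partitions by attaching them to a single connection.
    con = sqlite3.connect(PARTITIONS[0])
    for i, path in enumerate(PARTITIONS[1:], start=1):
        con.execute(f"ATTACH DATABASE ? AS p{i}", (path,))
    union = " UNION ALL ".join(
        ["SELECT region, amount FROM sales"]
        + [f"SELECT region, amount FROM p{i}.sales" for i in range(1, len(PARTITIONS))]
    )
    for region, total in con.execute(
        f"SELECT region, SUM(amount) FROM ({union}) GROUP BY region"
    ):
        print(region, total)
    con.close()

Each partition file can be loaded (and indexed) independently, which is where the parallelism comes from; the UNION ALL over attached databases is only needed when a query has to span partitions.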