List:       sqlite-users
Subject:    Re: [sqlite] SQLITE_MAX_VARIABLE_NUMBER and .import for very
From:       "Griggs, Donald" <Donald.Griggs () allscripts ! com>
Date:       2008-12-31 16:56:06
Message-ID: 8F808F4B25AF2744835E2BDA28027931209933EF () NCMAILBE2 ! onemisys ! com

Regarding:
>> I am sure there is a better way to deal with 12K rows by 2500 
>> columns, but I can't figure it out....

I wonder if you might want to use *sed* or *awk* or *perl* to preprocess
the data before import.
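One way to preprocess, sketched here in Python rather than sed/awk (the file layout, column names, and the 50-column cutoff are assumptions for illustration): split each row into a narrow "master" file and a "details" file sharing the person id, so each piece stays well under SQLite's limits before .import.

```python
import csv

def split_wide_csv(src, master_out, detail_out, keep=50):
    """Copy the person-id column plus the first `keep` data columns to
    master_out, and the person-id column plus the rest to detail_out."""
    with open(src, newline="") as f, \
         open(master_out, "w", newline="") as m, \
         open(detail_out, "w", newline="") as d:
        mw, dw = csv.writer(m), csv.writer(d)
        for row in csv.reader(f):
            person_id, rest = row[0], row[1:]
            mw.writerow([person_id] + rest[:keep])
            dw.writerow([person_id] + rest[keep:])
```

The same split could of course be done with cut or awk; the point is only that each output file maps to its own reasonably narrow table.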

A "master" table could contain the unique person id, plus the fields
that you intend to index and that you are likely to filter upon most
often.  Other tables could exist for the remaining data, and could be
joined on the person id as needed.

This might:
   -- let you avoid recompiling a customized version of sqlite with
      raised column/variable limits
   -- allow your most-used queries to run faster
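A minimal sketch of that master/detail layout, using Python's built-in sqlite3 module (the table and column names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- "Master" table: unique person id plus the fields you index
    -- and filter on most often.
    CREATE TABLE person_master (
        person_id  INTEGER PRIMARY KEY,
        last_name  TEXT,
        birth_year INTEGER
    );
    CREATE INDEX idx_master_last_name ON person_master(last_name);

    -- One of several tables holding the remaining, rarely filtered
    -- columns, each kept well under SQLite's default column limit.
    CREATE TABLE person_details_a (
        person_id INTEGER REFERENCES person_master(person_id),
        extra_1   TEXT,
        extra_2   TEXT
    );
""")
conn.execute("INSERT INTO person_master VALUES (1, 'Smith', 1970)")
conn.execute("INSERT INTO person_details_a VALUES (1, 'x', 'y')")

# Join on person_id only when the extra columns are actually needed;
# queries touching just the master table stay fast.
rows = conn.execute("""
    SELECT m.last_name, d.extra_1
      FROM person_master m
      JOIN person_details_a d ON d.person_id = m.person_id
     WHERE m.last_name = 'Smith'
""").fetchall()
print(rows)  # [('Smith', 'x')]
```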

_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users