
List:       r-sig-geo
Subject:    [R-sig-Geo] dealing with large spatial data
From:       Ozlem Yanmaz <ozlem.yanmaz@gmail.com>
Date:       2010-09-24 16:54:25
Message-ID: AANLkTimw+uC4LjEO3HjB1grox=Z7i2x+kUf6u9kCD8Vi@mail.gmail.com

Dear fellow R users,

I am fairly new to spatial models. I have been using the "spdep" package
to model the spatial correlation between my data points, and I have used
the "dnearneigh" and "knearneigh" functions to build the neighborhood
list. These run without problems when the data set is small. My problem
is that the data set I will eventually be working on is fairly large
(more than 50,000 points), so I run into memory problems. Is there a way
to speed up this process, for example by creating the neighborhoods in
chunks and combining them later into one large weights matrix for the
"spautolm" function?
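For concreteness, here is a minimal sketch of the workflow I have been
using, on a small toy data set (the coordinates, response, and formula
below are invented purely for illustration):

    library(spdep)

    ## toy data: random coordinates and a response, for illustration only
    set.seed(1)
    n <- 500
    coords <- cbind(runif(n, 0, 100), runif(n, 0, 100))
    dat <- data.frame(y = rnorm(n), x1 = rnorm(n))

    ## k-nearest-neighbor list (k = 4 here); sym = TRUE forces a
    ## symmetric relation, which suits spautolm
    nb <- knn2nb(knearneigh(coords, k = 4), sym = TRUE)

    ## alternatively, distance-based neighbors within 10 units:
    ## nb <- dnearneigh(coords, d1 = 0, d2 = 10)

    ## row-standardized spatial weights; zero.policy = TRUE tolerates
    ## observations with no neighbors (possible with dnearneigh)
    lw <- nb2listw(nb, style = "W", zero.policy = TRUE)

    ## simultaneous autoregressive model
    fit <- spautolm(y ~ x1, data = dat, listw = lw)
    summary(fit)

This runs fine at toy sizes like the above; the memory problem appears
once n grows past roughly 50,000 points.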

Any ideas or suggestions are appreciated.

Thanks and Regards

_______________________________________________
R-sig-Geo mailing list
R-sig-Geo@stat.math.ethz.ch
https://stat.ethz.ch/mailman/listinfo/r-sig-geo