
List:       flume-user
Subject:    Re: HDFS Sink Error
From:       Shangan Chen <chenshangan521@gmail.com>
Date:       2014-01-25 13:21:14
Message-ID: CAC+bTuU_5TkxZToZFo-ivTMv0TYzPHWViYGCFyaC=N23c_i8PA@mail.gmail.com

Have you applied this patch: https://issues.apache.org/jira/browse/FLUME-2172 ?


On Tue, Jan 21, 2014 at 5:29 PM, Himanshu Patidar <
himanshu.patidar@hotmail.com> wrote:

> I am getting the same error even after building flume with guava-11.0.2
> 
> Thanks,
> Himanshu
> 
> 
> ------------------------------
> Date: Tue, 21 Jan 2014 17:18:06 +0800
> Subject: Re: HDFS Sink Error
> From: chenshangan521@gmail.com
> To: user@flume.apache.org
> 
> 
> you can try this patch: https://issues.apache.org/jira/browse/FLUME-2172
> and build Flume with guava-11.0.2, the same version that hadoop-2.0.x uses.
> Flume currently uses guava-10.0.1, so just change the version in this
> dependency:
> 
> <dependency>
>   <groupId>com.google.guava</groupId>
>   <artifactId>guava</artifactId>
>   <version>10.0.1</version>
> </dependency>
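> With the bump applied, the dependency would read as below (a sketch; which
> pom.xml to edit depends on your Flume source tree and release):

```xml
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>11.0.2</version>
</dependency>
```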
> 
> 
> On Tue, Jan 21, 2014 at 3:35 PM, Himanshu Patidar <
> himanshu.patidar@hotmail.com> wrote:
> 
> Hi,
> 
> I am trying to feed a single-node HDFS cluster.
> 
> But I am getting this error:
> 
> 2014-01-21 08:29:44,426 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.HDFSDataStream.configure(HDFSDataStream.java:56)] Serializer = TEXT, UseRawLocalFileSystem = false
> 2014-01-21 08:29:44,474 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:219)] Creating hdfs://xyz.16.137.81:54545/flume/FlumeData.1390289384427.tmp
> 2014-01-21 08:29:44,878 (SinkRunner-PollingRunner-DefaultSinkProcessor) [ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:422)] process failed
> java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
>         at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
>         at org.apache.hadoop.hdfs.protocol.proto.HdfsProtos$FsPermissionProto.getSerializedSize(HdfsProtos.java:5407)
>         at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
>         at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
>         at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$CreateRequestProto.getSerializedSize(ClientNamenodeProtocolProtos.java:2371)
>         at com.google.protobuf.AbstractMessageLite.toByteString(AbstractMessageLite.java:49)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.constructRpcRequest(ProtobufRpcEngine.java:149)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:193)
>         at $Proxy11.create(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:601)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
>         at $Proxy11.create(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:192)
>         at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1298)
>         at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1317)
>         at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1242)
>         at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1199)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:273)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:262)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:79)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:851)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:832)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:731)
>         at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:720)
>         at org.apache.flume.sink.hdfs.HDFSDataStream.open(HDFSDataStream.java:80)
>         at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:227)
>         at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:220)
>         at org.apache.flume.sink.hdfs.BucketWriter$8$1.run(BucketWriter.java:557)
>         at org.apache.flume.sink.hdfs.BucketWriter.runPrivileged(BucketWriter.java:160)
>         at org.apache.flume.sink.hdfs.BucketWriter.access$1000(BucketWriter.java:56)
>         at org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:554)
>         at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603)
>         at java.lang.Thread.run(Thread.java:722)
> 
> 
> My conf file:
> 
> # Naming the components in this Agent
> ###############################
> httpagent.sources = http-source
> httpagent.sinks = local-file-sink
> httpagent.channels = ch3
> 
> # Define / Configure Source
> ###############################
> httpagent.sources.http-source.type = org.apache.flume.source.http.HTTPSource
> 
> httpagent.sources.http-source.channels = ch3
> httpagent.sources.http-source.port = 44444
> 
> 
> # Local File Sink
> ###############################
> httpagent.sinks.local-file-sink.type = hdfs
> httpagent.sinks.local-file-sink.channel = ch3
> httpagent.sinks.local-file-sink.hdfs.path = hdfs://xyz.16.137.81:54545/flume
> httpagent.sinks.local-file-sink.hdfs.fileType = DataStream
> #httpagent.sinks.local-file-sink.hdfs.filePrefix = events-
> httpagent.sinks.local-file-sink.hdfs.round = true
> httpagent.sinks.local-file-sink.hdfs.roundValue = 1
> httpagent.sinks.local-file-sink.hdfs.roundUnit = minute
> 
> # Channels
> ###############################
> httpagent.channels.ch3.type = memory
> httpagent.channels.ch3.capacity = 1000
> httpagent.channels.ch3.transactionCapacity = 100
> 
> 
> Apart from that, I have these jars for hadoop/hdfs in the /lib folder:
> 
> commons-codec-1.4.jar
> commons-configuration-1.6.jar
> commons-httpclient-3.1.jar
> hadoop-annotations-2.0.0-cdh4.2.0.jar
> hadoop-auth-2.0.0-cdh4.2.0.jar
> hadoop-client-2.0.0-mr1-cdh4.2.0.jar
> hadoop-common-2.0.0-cdh4.2.0.jar
> hadoop-core-2.0.0-mr1-cdh4.2.0.jar
> hadoop-hdfs-2.0.0-cdh4.2.0.jar
> protobuf-java-2.5.0.jar
> 
> I believe this error is coming from the protobuf-java-2.5.0.jar file.
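> To see which guava/protobuf jars are actually on the classpath, something
> like this works (a sketch; LIB_DIR is an assumed path, point it at your
> Flume installation's lib folder):

```shell
# List guava/protobuf jars in the Flume lib dir to spot version mismatches.
# LIB_DIR is an assumed path; adjust to your installation.
LIB_DIR="${LIB_DIR:-./lib}"
ls "$LIB_DIR" 2>/dev/null | grep -E '^(guava|protobuf-java)-' || echo "no guava/protobuf jars found"
```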
> 
> Any suggestions will be of great help!!!
> 
> 
> 
> Thanks,
> Himanshu
> 
> 
> 
> 
> --
> have a good day!
> chenshang'an
> 
> 


-- 
have a good day!
chenshang'an





