
List:       flume-user
Subject:    Re: Spark suppress INFO messages per Streaming Job
From:       Gonzalo Herreros <gherreros () gmail ! com>
Date:       2016-02-24 8:14:25
Message-ID: CAM-G-DWZx4b2u+zw_C+30CbzZb8vsj1EZ1hEJ0pQM2Pj5tChfQ () mail ! gmail ! com

The way I have done that is by making a copy of the Spark config folder with
the updated log4j settings and running the job with the flag that points to
that configuration folder.
The drawback is that if you later change other Spark settings for the
cluster, that job won't pick them up.
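As a rough sketch of that approach (the paths, job class, and jar name are illustrative, not from any actual setup):

```shell
# Make a private copy of the Spark config folder for this job
# (paths are illustrative)
cp -r "$SPARK_HOME/conf" /path/to/streaming-job-conf

# In the copy, raise the root logger threshold so INFO is suppressed,
# e.g. edit /path/to/streaming-job-conf/log4j.properties to contain:
#   log4j.rootCategory=WARN, console

# Submit the job pointing at the alternative config folder
SPARK_CONF_DIR=/path/to/streaming-job-conf \
  spark-submit --class com.example.StreamingJob streaming-job.jar
```

`SPARK_CONF_DIR` is the standard environment variable Spark scripts use to locate the config directory, so only this job sees the quieter log4j settings.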

I guess other options are symlinking the shared config files into that
alternative config folder, or maybe putting a log4j.properties in front of the
driver/executor classpath with the extraClassPath options
(spark.driver.extraClassPath / spark.executor.extraClassPath).
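A sketch of that classpath variant (the property names are Spark's standard ones; the paths and job class are illustrative):

```shell
# Put a directory containing only a custom log4j.properties ahead of the
# driver and executor classpaths, so log4j loads it before the cluster
# default (paths are illustrative)
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/log4j-override \
  --conf spark.executor.extraClassPath=/path/to/log4j-override \
  --class com.example.StreamingJob streaming-job.jar
```

Note the override directory has to exist on the executor nodes too, or be shipped to them, for the executor side to take effect.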

Maybe in the Spark user list people know of better ways.

Gonzalo

On 23 February 2016 at 23:54, Sutanu Das <sd2302@att.com> wrote:

> Community,
>
>
>
> How can I suppress INFO messages from a Spark Streaming job on a per-job
> basis? Meaning, I don't want to change the log4j properties for the entire
> Spark cluster, but want to suppress just the INFO messages for a specific
> Streaming job, perhaps in the job properties file. Is that possible?
>
>
>
> Or do I need to use the sc._jvm.Logging function inside our Scala code
> to suppress INFO messages from RDDs?
>
>
>
> Please help us; otherwise, the Streaming job's redirected output log grows
> so big with those INFO messages that our file system is filling up. Thanks again.
>



