
List:       openjdk-serviceability-dev
Subject:    Fwd: hprof format question
From:       Simon Roberts <simon () dancingcloudservices ! com>
Date:       2018-10-31 11:07:31
Message-ID: CADsQA7iHo5jqiqbRV4QyYOxhZCVF_r-Pa7RXXio9BuyCyFUEGw () mail ! gmail ! com

Hi all, I'm hoping this is the correct list for a question on the hprof
file format (1.0.2)?

I found this information:
http://hg.openjdk.java.net/jdk6/jdk6/jdk/raw-file/tip/src/share/demo/jvmti/hprof/manual.html

and have been working on a small project to read these files. (Yes, I know
that NetBeans/VisualVM and Eclipse both have such libraries, and a number
of other tools have been derived from them, but as far as I can tell, they
are all fundamentally built on the notion of fully decoding everything and
creating in-memory representations of the entire heap. I want to pull out
only certain pieces of information--specifically, object counts by
class--from a large, ~20 GB, dump file, and those tools just give up the
ghost on my systems.)
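For context, the top-level loop I have in mind is roughly this (just a
sketch against the 1.0.2 layout in that manual--u1 tag, u4 time, u4
length per record--with class and method names of my own invention):

```java
import java.io.*;

// Sketch of a streaming hprof 1.0.2 reader: parse the header, then walk
// top-level records using each record's own u4 length field, so nothing
// about the heap has to be materialized. Layout is per the manual linked
// above; nothing here is decoded beyond the record framing.
public class HprofScan {
    static int countTopLevelRecords(DataInputStream in) throws IOException {
        // Header: NUL-terminated version string, u4 identifier size, u8 timestamp
        while (in.readByte() != 0) { /* skip "JAVA PROFILE 1.0.2" */ }
        int idSize = in.readInt();   // 8 on 64-bit VMs
        in.readLong();               // dump timestamp, unused here
        int records = 0;
        try {
            while (true) {
                int tag = in.readUnsignedByte();  // e.g. 0x1C = HEAP DUMP SEGMENT
                in.readInt();                     // microseconds since header time
                long len = Integer.toUnsignedLong(in.readInt());
                in.skipNBytes(len);               // skip the body without decoding it
                records++;
            }
        } catch (EOFException done) { /* clean end of stream */ }
        return records;
    }
}
```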

Anyway, my code reads the file pretty well so far, except that the file I
want to analyze seems to contradict the specification in the document
mentioned above. Specifically, after processing about five
HEAP_DUMP_SEGMENTs with around 1.5 million sub-records in each, I come
across some ROOT_JNI_LOCAL records. The first 15 follow the format
specified in the above document (one 8-byte "ID" and two 4-byte values).
But the 16th omits the two 4-byte values (or it might simply have more;
visual analysis shows that after the 8-byte ID, I see a new block tag and
a believable structure). I've also noticed that several of the record
types defined in this "group" seem to diverge from the document I
mentioned.
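For reference, my reader treats that sub-record as the fixed shape the
manual gives (field names are mine); the comment notes why a single
mis-sized record derails everything after it:

```java
import java.io.*;

// ROOT JNI LOCAL (sub-record tag 0x02) as the manual describes it:
// one ID (the object id), then u4 thread serial number, then u4 frame
// number. Sub-records carry no length field, so if the writer ever emits
// a shape the reader doesn't expect, the stream desynchronizes and every
// later byte is misread as a tag.
class RootJniLocal {
    final long objectId;
    final int threadSerial;
    final int frameNumber;   // -1 denotes an empty frame, per the manual

    RootJniLocal(DataInputStream in, int idSize) throws IOException {
        objectId = (idSize == 8) ? in.readLong()
                                 : Integer.toUnsignedLong(in.readInt());
        threadSerial = in.readInt();
        frameNumber = in.readInt();
    }
}
```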

My solution is that if my parser trips, it abandons that HEAP_DUMP_SEGMENT
from that point forward. This doesn't seem to matter much, since I was
looking for object data, and it appears that all of that has already been
handled by then. However, clearly this is not ideal!
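(The bail-out is possible at all because the segment's u4 length is read
before any sub-record, so the end offset is always known. A sketch, with
parseSubRecord standing in for my real per-tag dispatch:)

```java
import java.io.*;

// Sketch of the bail-out: the HEAP_DUMP_SEGMENT body length is known
// before any sub-record is parsed, so on a parse failure we can seek past
// the rest of the segment and resume cleanly at the next top-level record.
class SegmentReader {
    static void readSegment(RandomAccessFile file, long bodyLength) throws IOException {
        long end = file.getFilePointer() + bodyLength;
        try {
            while (file.getFilePointer() < end) {
                parseSubRecord(file);
            }
        } catch (IOException trip) {
            file.seek(end);   // abandon the rest of this segment
        }
    }

    // Placeholder for the real dispatch on the sub-record tag byte.
    static void parseSubRecord(RandomAccessFile f) throws IOException {
        throw new IOException("unknown sub-record tag");
    }
}
```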

Is there any more detailed, newer, better information? Or anything else I
should know to pursue this tool (or indeed a simple object-frequency-by-
classname result) from an hprof 1.0.2 format file?

(And yes, I'm pursuing a putative memory leak :)

Thanks for any input (including "dude, this is the wrong list!")
Cheers,
Simon



-- 
Simon Roberts
(303) 249 3613






