List: usrp-users
Subject: [USRP-users] USRP N210 C API stream SC16
From: Marco Spanghero <marcosp () kth ! se>
Date: 2022-10-27 11:36:26
Message-ID: 76da335236934c9a94b1213673509b88 () kth ! se
Hi,
I am currently facing a problem using the C API with a USRP N210 r4. I am trying to stream an int16_t interleaved IQ file. When using the included C++ tx_samples_from_file demo, the baseband is streamed without interruptions or other issues. The BB is received at the DUT and the symbols appear correct.
Unfortunately, integration with a larger piece of software requires the C API. What is unclear to me is how the C and C++ APIs compare in their handling of samples: it seems there is no direct mapping between std::complex<short> and interleaved int16_t IQ data. When analyzed with a spectrum analyzer, the streamed basebands look completely different.
Details on the implemented functions and the problem:
Issue Description
I am using the C API to stream a binary data file. The file is saved as interleaved int16_t [I][Q] samples. I am using an N210 r4 with the latest UHD and FPGA image. The MTU on the NIC is configured at 3000, and the buffer size is 10000 samples. The same file, when streamed with the included tx_samples_from_file example, works fine and the baseband is received correctly. When the equivalent code is written in C, the baseband is not correct.
Setup Details
Implemented C code streaming loop:
while (1) {
    if (stop_signal_called)
        break;

    uhd_tx_metadata_make(&md, false, 0, 0.1, false, false);
    size_t read = fread(buff, sizeof(int16_t), samps_per_buff, file);
    for (int i = 0; i < read; i++) {
        printf("%d \n", buff[i]);
    }

    if (read > 0) {
        uhd_tx_streamer_send(tx_streamer, buffs_ptr, read, &md, 0.1, &num_samps_sent);
        total_num_samps += num_samps_sent;
    }
    else
        break;

    if (verbose)
        printf("\n Sent %ld - from file %ld\n ", total_num_samps, read);
}
buff contains the data block to stream and is defined as buff = malloc(samps_per_buff * sizeof(int16_t));
C metadata
uhd_stream_args_t stream_args = {
    .cpu_format = "sc16",
    .otw_format = "sc16",
    .args = "",
    .channel_list = 0,
    .n_channels = 1};
Reference C++ streaming loop:
void send_from_file(
    uhd::tx_streamer::sptr tx_stream, const std::string& file, size_t samps_per_buff)
{
    uhd::tx_metadata_t md;
    md.start_of_burst = false;
    md.end_of_burst   = false;
    std::vector<samp_type> buff(samps_per_buff);
    std::ifstream infile(file.c_str(), std::ifstream::binary);

    // loop until the entire file has been read
    while (not md.end_of_burst and not stop_signal_called) {
        infile.read((char*)&buff.front(), buff.size() * sizeof(samp_type));
        size_t num_tx_samps = size_t(infile.gcount() / sizeof(samp_type));
        md.end_of_burst = infile.eof();

        const size_t samples_sent = tx_stream->send(&buff.front(), num_tx_samps, md);
        if (samples_sent != num_tx_samps) {
            UHD_LOG_ERROR("TX-STREAM",
                "The tx_stream timed out sending " << num_tx_samps << " samples ("
                    << samples_sent << " sent).");
            return;
        }
    }
    infile.close();
}
Reference C++ metadata
uhd::stream_args_t stream_args("sc16", "sc16");
channel_nums.push_back(boost::lexical_cast<size_t>(channel));
stream_args.channels = 0;
uhd::tx_streamer::sptr tx_stream = usrp->get_tx_stream(stream_args);
Expected Behavior
I would expect the two examples to behave identically; the transmitted baseband should be the same.
Actual Behavior
When shown on a spectrum analyzer, the C example shows a much larger gain and the baseband appears fragmented. I don't understand how the C API handles the buffer streaming; according to the code, the C function wraps exactly the same behavior as the C++ send call.
Steps to reproduce the problem
I can provide source code for the two examples and the binary file. Both examples are executed at 2.5 Msps.
Question
What am I missing here to correctly stream the baseband? My understanding is that once the data type is fixed in the stream arguments, we call uhd_tx_streamer_send with the number of samples we want to stream (which is what the C++ example does, using std::complex<short> as the sample type). In the C case, how do we achieve the same behavior?
I would really appreciate any support in this matter.
Best regards
Marco Spanghero
_______________________________________________
USRP-users mailing list -- usrp-users@lists.ettus.com
To unsubscribe send an email to usrp-users-leave@lists.ettus.com