Hi.
I recently integrated the ns3-mmwave V5.0 release with DCE 1.11; you can see this integration at the links below:
V5.0:
https://github.com/sod-lol/ns-3-dev-git/tree/ns-3.35_with_ns-mmwave-v5.0_with_dce_1.11
https://github.com/sod-lol/ns-3-dce/tree/dce-1.11_integration_with_mmwave-v5.0
V2.0:
https://github.com/sod-lol/ns-3-dev-git/tree/ns-3.29_with_ns-mmwave-v2.0_with_dce_1.10_main-paper
https://github.com/sod-lol/ns-3-dce/tree/dce-1.10_integration_with_mmwave-v2.0
However, when I ran the dce-mmwave-mptcp example, I got a strange and unreasonable result that was entirely different from the results in "End-to-End Simulation of 5G mmWave Networks". After some investigation, I ran the same example on the V2.0 release, and its results were similar to the paper's. But I noticed that in the V2.0 release the dce-mmwave-mptcp example sets the subframe period to 100 microseconds, which differs from the 1 millisecond used in V5.0 and in the 3GPP specifications. And indeed, changing the subframe period in V2.0 from 100 microseconds to 1 millisecond produces the same strange output as V5.0.
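For reference, this is essentially how the subframe period gets overridden; a minimal sketch, assuming the attribute is named ns3::MmWavePhyMacCommon::SubframePeriod and is expressed in microseconds as in older mmwave releases (the exact attribute name and value type may differ between V2.0 and V5.0, so check mmwave-phy-mac-common.cc in your tree):

```cpp
// Hypothetical sketch: the attribute name and type below are assumptions
// based on older ns3-mmwave releases; verify them against your release.
#include "ns3/core-module.h"

using namespace ns3;

int
main (int argc, char *argv[])
{
  // V2.0 ships the example with a 100 us subframe; switching this to
  // 1000 us (1 ms, as in V5.0 and 3GPP) is what changes the results.
  Config::SetDefault ("ns3::MmWavePhyMacCommon::SubframePeriod",
                      DoubleValue (100.0)); // microseconds
  // ... rest of the dce-mmwave-mptcp setup ...
  return 0;
}
```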
Here is the output from iperf:
Strangely, the iperf throughput is usually 0, which does not match the paper's results.
Here is the throughput of LTE, mmWave, and iperf (generated by captcp):
There is quite a difference between the mmWave throughput and the throughput reported in the paper.
Here is the source code:
After these results, I tried to simulate another scenario to examine what the problem is, so I decided to move the building and the UE close to the base station. However, I got another strange result.
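For this second scenario, the placement is done roughly as in the sketch below; it uses the standard ns-3 buildings and mobility helpers, and the coordinates are made-up illustrative values rather than the exact ones from my script:

```cpp
#include "ns3/core-module.h"
#include "ns3/mobility-module.h"
#include "ns3/buildings-module.h"

using namespace ns3;

// Illustrative positions only; the real scenario uses different coordinates.
void
PlaceScenario (NodeContainer enbNodes, NodeContainer ueNodes)
{
  // A small building sitting near the line between the eNB and the UE.
  Ptr<Building> building = CreateObject<Building> ();
  building->SetBoundaries (Box (20.0, 30.0, -5.0, 5.0, 0.0, 10.0));

  MobilityHelper mobility;
  mobility.SetMobilityModel ("ns3::ConstantPositionMobilityModel");
  mobility.Install (enbNodes);
  mobility.Install (ueNodes);

  // eNB at the origin, UE 40 m away, so both are close to the building.
  enbNodes.Get (0)->GetObject<MobilityModel> ()->SetPosition (Vector (0.0, 0.0, 15.0));
  ueNodes.Get (0)->GetObject<MobilityModel> ()->SetPosition (Vector (40.0, 0.0, 1.6));

  // Register the nodes with the buildings module so the channel model
  // can compute blockage from the building above.
  BuildingsHelper::Install (enbNodes);
  BuildingsHelper::Install (ueNodes);
}
```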
Here is the iperf output:
Here is the throughput of LTE, mmWave, and iperf (generated by captcp):
Most of the time the iperf throughput is stable, implying that the building does not affect it; also, at the start and end of the simulation the throughput does not ramp up as it does in the paper's figures.
Here is the throughput of mmWave (generated by Wireshark):
As you can see, the building does not have much effect on the throughput, and the throughput is much lower than reported in the paper.
Here is the source code:
After these results, I came up with several questions:
How can I fix this issue? What is the cause of this problem? How can I get the same results as the paper with a 1-millisecond subframe period?
Does this mean the paper's simulation settings are slightly different from the 3GPP specification, or is this due to changes between releases?
I tried several other settings, but none had as significant an impact on throughput as the subframe period. Why does the subframe period have such an impact?
Please let me know if you need any more information.
Thank you.