Actual value of the offset regression loss #97

YangJiao1996 opened this issue Apr 28, 2018 · 1 comment

@YangJiao1996
Hello there!
I recently implemented the side-refinement part of CTPN in TensorFlow. During training, I noticed that the side-refinement offset regression loss (Lo(re)) is much smaller than the other terms; for example, Lv(re) is often 100 to 200 times larger than Lo(re).
I wonder whether my implementation of the side-refinement part is correct or not...
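For context on how the relative magnitudes interact, the CTPN paper combines the three terms with balancing weights λ1 and λ2 (the paper reports λ1 = 1.0 and λ2 = 2.0, with each term normalized by its own anchor count). A minimal sketch of that combination, assuming each input loss is already averaged over its anchors:

```python
def ctpn_total_loss(l_s, l_v, l_o, lambda1=1.0, lambda2=2.0):
    """Combine the CTPN loss terms:
    l_s: text/non-text classification loss
    l_v: vertical coordinate regression loss (Lv(re))
    l_o: side-refinement offset regression loss (Lo(re))
    Each term is assumed to already be normalized by its sample count.
    lambda1/lambda2 are the balancing weights from the CTPN paper."""
    return l_s + lambda1 * l_v + lambda2 * l_o

# Example: if l_o is ~100x smaller than l_v, it contributes very little
# to the total even after the lambda2 = 2.0 weighting.
total = ctpn_total_loss(l_s=0.30, l_v=0.20, l_o=0.002)
```

This illustrates why a much smaller Lo(re) is not necessarily a bug on its own: the side-refinement targets are normalized by the fixed 16-px anchor width, so they are small numbers to begin with.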

@hcnhatnam

Hmm... in my implementation, Lv(re) is often 50 to 100 times smaller than Lo(re).
My approach is as follows:
[attached figure: yellow ground-truth box and black anchor box, with arrows d1 and d2]
Example: the yellow bbox is the ground truth (gt), the black box (bl) is the anchor under consideration, and I compute x_side (of bl) = (x_leftside, x_rightside), with anchor width w_a = 16:

  • x_leftside = d1/16 (dark green arrow)
  • x_rightside = d2/16 (bright blue arrow)
    (d1 and d2 are the horizontal (x-axis) distances from the anchor center to the left and right sides of gt, respectively)
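The steps above can be sketched as a small target-computation function (a minimal example, assuming the CTPN convention o = (x_side − c_x^a) / w_a with fixed anchor width w_a = 16; function and variable names are illustrative, not from either implementation):

```python
def side_refinement_offset(x_side, anchor_cx, anchor_width=16.0):
    """Regression target for CTPN side refinement: the signed horizontal
    distance from the anchor center (anchor_cx) to a ground-truth side
    coordinate (x_side), normalized by the fixed anchor width (16 px)."""
    return (x_side - anchor_cx) / anchor_width

# Anchor centered at x = 100; gt left side at x = 94, gt right side at x = 108.
o_left = side_refinement_offset(94.0, 100.0)    # d1 = -6  -> -0.375
o_right = side_refinement_offset(108.0, 100.0)  # d2 = +8  ->  0.5
```

Since the offsets are normalized by only 16 pixels, the targets (and hence the loss) stay well below 1 in magnitude for anchors near the text sides, which is consistent with Lo(re) being small relative to the other terms.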
