revise self._unfold2d(x, ws=8) ? #52
```python
import torch
from modules.dataset.megadepth import megadepth_warper
from modules.training import utils
from third_party.alike_wrapper import extract_alike_kpts
from modules.model_small import UNFLOD_WS

def dual_softmax_loss(X, Y, temp = 0.2):
def smooth_l1_loss(input, target, beta=2.0, size_average=True):
def fine_loss(f1, f2, pts1, pts2, fine_module, ws=7):
def alike_distill_loss(kpts, img):
def keypoint_position_loss(kpts1, kpts2, pts1, pts2, softmax_temp = 1.0):
def coordinate_classification_loss(coords1, pts1, pts2, conf):
def keypoint_loss(heatmap, target):
def hard_triplet_loss(X,Y, margin = 0.5):
```
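For context on the title's question: `_unfold2d(x, ws)` is a space-to-depth style operation that splits the input into non-overlapping ws x ws cells and stacks each cell's pixels into the channel dimension, turning a (B, C, H, W) tensor into (B, C*ws*ws, H/ws, W/ws). The sketch below illustrates that behavior; it is a minimal re-implementation for reference, not the repository's exact code.

```python
import torch

def unfold2d_sketch(x: torch.Tensor, ws: int = 8) -> torch.Tensor:
    """Space-to-depth: stack each non-overlapping ws x ws cell into the channel dim.

    (B, C, H, W) -> (B, C * ws * ws, H // ws, W // ws). With C = 1 and ws = 8 this
    yields the 64 channels the default keypoint head expects; ws = 2 yields only 4.
    """
    B, C, H, W = x.shape
    x = x.unfold(2, ws, ws).unfold(3, ws, ws)          # (B, C, H//ws, W//ws, ws, ws)
    x = x.reshape(B, C, H // ws, W // ws, ws * ws)     # flatten each cell
    return x.permute(0, 1, 4, 2, 3).reshape(B, -1, H // ws, W // ws)
```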
Hi @longzeyilang, after a quick review your updates look correct in theory. The only problem I see is that a 2x2 patch provides too little context for the keypoint head to be effective. What kind of issues are you experiencing?
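To make the concern concrete: the keypoint head classifies each ws x ws cell into ws*ws position bins plus one "no keypoint" bin (65 classes when ws = 8). A minimal sketch of a head parameterized by ws, under that assumption; `make_keypoint_head` is a hypothetical helper using plain conv/BN/ReLU layers rather than the repository's `BasicLayer`:

```python
import torch.nn as nn

def make_keypoint_head(ws: int = 8, in_ch_per_pixel: int = 1, width: int = 64) -> nn.Sequential:
    # Illustrative only: input channels come from _unfold2d on a grayscale image,
    # output channels are ws*ws position bins + 1 dustbin.
    in_ch = in_ch_per_pixel * ws * ws
    n_classes = ws * ws + 1
    return nn.Sequential(
        nn.Conv2d(in_ch, width, 1), nn.BatchNorm2d(width), nn.ReLU(inplace=True),
        nn.Conv2d(width, width, 1), nn.BatchNorm2d(width), nn.ReLU(inplace=True),
        nn.Conv2d(width, n_classes, 1),   # (B, ws*ws + 1, H/ws, W/ws) logit map
    )
```

With ws = 2 the head sees only a 2x2 pixel patch per cell and has just 4 position bins plus the dustbin, so both the spatial context and the label space shrink sharply.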
Hi, I trained on my own data (image size about 128x128) and changed the model as follows:
```python
self.block1 = nn.Sequential(
    BasicLayer( 1,  8, stride=1),
    BasicLayer( 8, 24, stride=1),
    BasicLayer(24, 64, stride=1),
)
self.block2 = nn.Sequential(
    BasicLayer(64, 64, stride=2),
    BasicLayer(64, 64, stride=1),
    BasicLayer(64, 64, stride=1),
)
```
and changed the forward pass as follows:
```python
def forward(self, x):
    """
    input:
        x -> torch.Tensor(B, C, H, W) grayscale or rgb images
    return:
        feats     -> torch.Tensor(B, 64, H/8, W/8) dense local features
        keypoints -> torch.Tensor(B, 65, H/8, W/8) keypoint logit map
        heatmap   -> torch.Tensor(B,  1, H/8, W/8) reliability map
    """
```
The `_unfold2d` ws is changed to 2. How should I revise keypoint_head, and how should I revise losses.py?
Thank you.
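For reference, here is one way the cell-wise keypoint labels used by a distillation-style loss could be parameterized by the window size instead of hard-coding 8 (cell size) and 64 (dustbin index). This is a hedged sketch, not the repository's confirmed fix: it assumes each ws x ws cell gets the linear offset of the keypoint inside it as its class label, with ws*ws meaning "no keypoint", and the names `build_cell_labels` and `distill_loss` are hypothetical.

```python
import torch
import torch.nn.functional as F

def build_cell_labels(kpts_xy: torch.Tensor, H: int, W: int, ws: int = 8) -> torch.Tensor:
    """Hypothetical helper: map (N, 2) keypoint (x, y) pixel coords to per-cell labels.

    Each ws x ws cell gets the linear index (dy * ws + dx) of the keypoint that falls
    inside it, or ws*ws ("no keypoint") if the cell is empty. With ws = 2 there are
    only 4 position bins + 1 dustbin, matching a 5-channel keypoint logit map.
    Assumes all coordinates lie inside the image.
    """
    labels = torch.full((H // ws, W // ws), ws * ws, dtype=torch.long, device=kpts_xy.device)
    cell = (kpts_xy // ws).long()            # which cell each keypoint lands in
    offset = (kpts_xy - cell * ws).long()    # position of the keypoint inside its cell
    labels[cell[:, 1], cell[:, 0]] = offset[:, 1] * ws + offset[:, 0]
    return labels

def distill_loss(kpt_logits: torch.Tensor, kpts_xy: torch.Tensor, ws: int = 8):
    """Cross-entropy between a (ws*ws + 1, H/ws, W/ws) logit map and the cell labels."""
    C, Hc, Wc = kpt_logits.shape
    assert C == ws * ws + 1, "logit channels must match ws*ws + 1"
    labels = build_cell_labels(kpts_xy, Hc * ws, Wc * ws, ws)
    return F.cross_entropy(kpt_logits.reshape(C, -1).t(), labels.view(-1))
```

With ws = 2, every hard-coded 8 (cell size) and 64 (dustbin class) in the original loss would map to 2 and 4 respectively, and the keypoint head must output ws*ws + 1 = 5 channels to match.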