embeddings from ptnCloudEmbedder and edge information #145

Open
Yacovitch opened this issue Aug 1, 2019 · 6 comments

@Yacovitch

Hi Loic,

I have a couple of questions.

  1. Are the embeddings from ptnCloudEmbedder of shape [number of superpoints in each batch, dimension of the embedded vector]?
  2. Does the embedding contain the edges (connections between superpoints)?
  3. If it does not, where does model.ecc get the edge information from?
  4. It seems to iterate 7 times per epoch when training on the Semantic3D trainval set with batch size 2. Shouldn't it be 8 times, since there are 15 point clouds?
  5. Is the order of the batches selected randomly?

Thank you in advance!

@loicland
Owner

loicland commented Aug 1, 2019

  1. Yes: nbatch × nsuperpoints by embedding size.
  2. No, there is no edge information in the embeddings.
  3. In the superpoint graph files you have superedge descriptors. They are rewritten in GIs. Access them with GIs[0].edge_feats.
  4. The loader has the drop_last option, which means it drops the last batch when it is smaller than the batch size (see the sketch below).
  5. Yes, the order is different at each iteration.
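
For illustration, here is a minimal sketch (not the repository's training script) of how drop_last yields 7 iterations with 15 samples and a batch size of 2:

```python
# Minimal sketch: with drop_last=True the incomplete final batch is discarded,
# so a dataset of 15 samples with batch_size=2 gives floor(15 / 2) = 7 batches.
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.arange(15))   # 15 dummy samples
loader = DataLoader(dataset, batch_size=2, shuffle=True, drop_last=True)
print(len(loader))                          # 7
```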

@Yacovitch
Author

Thank you

@Yacovitch reopened this Aug 1, 2019
@Yacovitch
Author

Could you explain a bit more about the output of GIs[0].edge_feats?
Is it correct that it is [number of edges (connections between nodes), number of edge features (13)]?

Also, how can I extract the labels and the adjacency matrix of the graph that are associated with the features (embeddings) that go into model.ecc?

@loicland
Owner

loicland commented Aug 2, 2019

  1. Correct.
  2. The graphs are stored in an igraph structure in GIs. You can access the graph structure with all the methods of igraph (see the sketch below).

Labels are stored in label_mode and label_vec; they do not go into the ecc model, as they are the supervision.
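
For reference, a minimal sketch assuming the graphs in GIs are standard python-igraph Graph objects, as described above (the toy graph is only illustrative):

```python
# Minimal sketch: extracting the edge list and adjacency matrix of an igraph
# Graph, standing in for one of the superpoint graphs stored in GIs.
import igraph

g = igraph.Graph(edges=[(0, 1), (1, 2), (2, 0)])  # toy superpoint graph
edge_list = g.get_edgelist()     # list of (source, target) superpoint index pairs
adjacency = g.get_adjacency()    # igraph Matrix with 0/1 entries

print(edge_list)
print(adjacency)
```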

@Yacovitch
Author

Hello,
I still have a couple of questions. Thank you in advance for your reply.

  1. Does the number of edges in GIs[0].edge_feats correspond to the filtered or the unfiltered edges?
  2. Where does the edge filtering happen? Does it come from the partition step?
  3. Is it possible to extract the edge lists associated with the embeddings?
  4. You answered that the dimension of the embeddings is nbatch × nsuperpoints by embedding size. Is nsuperpoints consistent across batches?
  5. In the function set_batch in GraphConvInfo.py there is a for loop, for i,G in enumerate(graphs):. What is its purpose? I believe the batches are already separated by torch.utils.data.DataLoader(train_dataset, batch_size=args.batch_size, collate_fn=spg.eccpc_collate, num_workers=args.nworkers, shuffle=True, drop_last=True).

Again, thank you very much

@loicland
Owner

loicland commented Aug 3, 2019

1,2. Superedges are filtered in spg_reader in learning/spg.py, so they are the filtered ones. It removes edges that are too long.
3. Not sure that I understand. Superedges are not embedded. But if you want to extract the filter, it is computed at line 42 of learning/modules.py.
4. It is, because superpoints are over- or subsampled so that they contain only --ptn_npts (default 128) points; see the sketch below. The subsampling is random and different at each run.
5. This loop collates the different superpoint graphs into one easily accessible structure.
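
As an illustration of point 4, a minimal sketch (not the repository's actual code; the function name is made up) of randomly over- or subsampling a superpoint to a fixed number of points such as ptn_npts = 128:

```python
# Minimal sketch: force every superpoint to exactly n_pts points by random
# subsampling (too many points) or oversampling with replacement (too few).
import numpy as np

def resample_superpoint(points, n_pts=128):
    n = points.shape[0]
    idx = np.random.choice(n, n_pts, replace=(n < n_pts))
    return points[idx]

pts = np.random.rand(300, 3)             # a superpoint with 300 xyz points
print(resample_superpoint(pts).shape)    # (128, 3)
```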
