Conversation

@rahulmohan
Contributor

No description provided.


-    def __init__(self, input_feature_size, num_output_logits,
-                 gru_size=128, gru_layers=2, apply_softmax=False):
+    def __init__(self, sequence_length, input_feature_size, num_output_logits):

Contributor

GRU size and layers should remain parameters.

    super().__init__()
    self.num_output_logits = num_output_logits
    self.conv1 = nn.Conv1d(input_feature_size, 128, kernel_size=1, padding=0)
    self.gru = nn.GRU(128, 16, 1, batch_first=True, bidirectional=True)

Contributor

16 and 128 should be parameters instead of hard-coded.
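
Putting the two review comments together, here is a minimal sketch of the parameterized constructor being requested. The class name `Model` and the `conv_channels` argument name are placeholders, not taken from this PR; the defaults mirror the hard-coded values above.

```python
import torch.nn as nn

class Model(nn.Module):
    # Sketch only: keeps the hyperparameters the review asks for
    # as constructor arguments instead of hard-coded constants.
    def __init__(self, input_feature_size, num_output_logits,
                 conv_channels=128, gru_size=16, gru_layers=1):
        super().__init__()
        self.num_output_logits = num_output_logits
        # conv_channels replaces the hard-coded 128 output channels.
        self.conv1 = nn.Conv1d(input_feature_size, conv_channels,
                               kernel_size=1, padding=0)
        # gru_size and gru_layers replace the hard-coded 16 and 1.
        self.gru = nn.GRU(conv_channels, gru_size, gru_layers,
                          batch_first=True, bidirectional=True)
```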
