
Graph-grammar decoding #104

Open
ihh opened this issue Nov 20, 2018 · 1 comment
ihh commented Nov 20, 2018

Find an "ideal" generator by beginning with a beam search, then iteratively & greedily applying the best expansion of a given graph grammar on the current generator, keeping the generator probabilistically normalized & doing one round of EM to optimize the probabilities for each possible graphgram rule application.

ihh commented Nov 20, 2018

The information size of a weighted graph with N nodes and T transitions, with weights specified to a fixed width of W/lg(10) decimal places (i.e. W bits), is T(2 lg(N) + W): each transition costs lg(N) bits for each of its two endpoints, plus W bits for its weight.
Use something like this as a prior on generators, i.e. a size penalty. Each new generator must score at least this much (on the given acceptor) to be worth exploring. The decoder specifies W, and can set W=0 for non-weighted decoding.
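
As a quick worked check of the formula (a sketch; the function name is illustrative, not part of the codebase):

```python
import math

def generator_size_bits(num_nodes: int, num_transitions: int, weight_bits: int) -> float:
    """T * (2*lg(N) + W): each transition pays lg(N) bits for each of its two
    endpoints plus W bits for its weight; W=0 recovers the unweighted case."""
    return num_transitions * (2 * math.log2(num_nodes) + weight_bits)

# 16 nodes, 40 transitions, W=10 bits (about W/lg(10) ≈ 3 decimal places):
print(generator_size_bits(16, 40, 10))  # 40 * (2*4 + 10) = 720.0 bits
```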
