Hi,
Thank you for providing this useful resource. I would like to know more about the characteristics of this transformer compared to English ones. For example, how many parameters does it have, how many texts was it trained on, and what were the genres and types of text? If possible, please compare these statistics with an English model.
Thank you!
The architecture is the same as GPT-2 (small version). ParsGPT was trained on more than 700K documents collected from many sources, covering a wide range of writing styles.
I will probably publish more information about the data in a few weeks.
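Since the architecture matches GPT-2 small, you can check the parameter count yourself with the `transformers` library and compare it against the English model. A minimal sketch follows; the Hub identifier `HooshvareLab/gpt2-fa` is an assumption (substitute the actual model name if it differs), and any small difference from English GPT-2 small's ~124M parameters would come from the Persian tokenizer's vocabulary size, since the embedding matrix scales with the vocabulary:

```python
# Compare ParsGPT's parameter count with English GPT-2 small.
# NOTE: "HooshvareLab/gpt2-fa" is an assumed Hub identifier.
from transformers import GPT2LMHeadModel

def n_params(model):
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# English GPT-2 small (~124M parameters).
gpt2_en = GPT2LMHeadModel.from_pretrained("gpt2")

# ParsGPT: same GPT-2 small architecture, Persian vocabulary.
parsgpt = GPT2LMHeadModel.from_pretrained("HooshvareLab/gpt2-fa")

print(f"GPT-2 small (English): {n_params(gpt2_en):,} parameters")
print(f"ParsGPT (Persian):     {n_params(parsgpt):,} parameters")
```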