Transformers CFG
Grammar-constrained text generation with Transformers models
The transformers_cfg library lets you control the output of a language model by supplying a set of rules (a grammar) that the generated text must follow. This is useful for producing structured output such as code, JSON objects, or any text that must conform to a specific pattern. The library works with popular language models and offers a simple way to add grammar constraints to the generation process without modifying the underlying models.
Transformers_cfg is an extension library for the Hugging Face Transformers library that enables grammar-constrained text generation. It provides tools for working with context-free grammars (CFGs) in text generation tasks. The library supports various Transformer models, including LLaMA, GPT, Bloom, Mistral, and Falcon, and offers features such as multilingual grammar support and integration with Text-Generation-WebUI.
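To make the idea concrete, here is a small self-contained sketch of how grammar-constrained decoding works in principle. This is a toy illustration, not the transformers_cfg API: at each generation step, the set of next tokens is filtered to those that keep the output grammatical, and the "model" (here, a stand-in scorer) only chooses among the allowed tokens. The grammar used is balanced parentheses; all names below are hypothetical.

```python
import random

# Toy grammar: strings of balanced parentheses, e.g. "(())()".
# The constraint masks out any token that would violate the grammar,
# so the output is well-formed regardless of the model's scores.

def allowed_tokens(prefix):
    """Return the next tokens that keep the prefix grammatical."""
    depth = prefix.count("(") - prefix.count(")")
    allowed = ["("]                 # opening a group is always grammatical
    if depth > 0:
        allowed.append(")")         # may close only if something is open
    if depth == 0 and prefix:
        allowed.append("<eos>")     # may stop only when fully balanced
    return allowed

def constrained_decode(score, max_steps=8):
    """Greedy decoding: pick the highest-scoring *allowed* token each step."""
    prefix = ""
    for _ in range(max_steps):
        candidates = allowed_tokens(prefix)
        token = max(candidates, key=lambda t: score(prefix, t))
        if token == "<eos>":
            break
        prefix += token
    return prefix

# A stand-in "model" that scores tokens at random; even so, the grammar
# constraint guarantees no prefix ever closes an unopened parenthesis.
rng = random.Random(0)
out = constrained_decode(lambda prefix, token: rng.random())
print(out)
```

In the real library, the same filtering happens over the model's full vocabulary by zeroing out the logits of disallowed tokens before sampling, which is why no change to the model itself is required.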
active
entered showcase: 2024-04-16
entry updated: 2024-04-16
This project has not yet been evaluated by the C4DT Factory team.
We will be happy to evaluate it upon request.
Library
Python
MIT